This is a story about the power of redemption. Everyone makes mistakes in life and sometimes those mistakes warrant blocking someone on Facebook.
You’d better mean business, because when you block someone, your Facebook friendship is automatically ended. But if you are suddenly possessed by the spirit of generosity, there is a way to forgive and forget. It’s called unblocking someone on Facebook.
To be clear, unblocking someone means that they can see what you post publicly, but it does not mean that you are now Facebook friends with them. To do that, you must go the extra step of sending them a friend request. Facebook’s help center has more info on adding friends and blocking.
To unblock someone on Facebook, click the little downward-facing arrow in the top right corner of the screen.
In the drop-down, click “Settings & Privacy” then “Settings.”
That will take you to a new page with all of your account settings. In the lefthand column, click on “Blocking.” That will take you to the section where you can manage users, messages, pages, etc. that you’ve blocked.
In the “Block users” subsection, you can see a list of everyone that you’ve blocked. Next to the name of each blocked user is a button saying “Unblock.”
Click that to unblock someone, and then click “Confirm” in the window that pops up to complete the action.
Reblogged 1 week ago from feeds.mashable.com
Three years ago, I wrote a post for the Moz Blog advising how the latest news on mobile-first indexing would impact internal linking strategies, particularly for larger sites.
“By now, you’ve probably heard as much as you can bear about mobile first indexing”, I joked in my introduction. Little did I know.
Only now — in the summer of 2021 — are Google, supposedly, maybe, finalizing the rollout of mobile-first. Even as of August 2021, Google is still very much actively crawling sites with Googlebot desktop*.
As with the recent delays to the Core Web Vitals rollout, the issue here for Google is that they can’t push changes which make their results worse. As Mike King pointed out back in March over at iPullRank, there’s still a big disparity between the mobile and desktop versions of the web, especially when it comes to links.
I don’t need to persuade most SEOs that they should care about links, but I maybe do need to remind you that internal links are, for most pages, a much bigger part of how they get their strength than external links. On an even vaguely established site, it’s not unreasonable to think that including a landing page in your top nav is going to generate more impactful links than most digital PR campaigns could ever hope to. And yet, sites tend to focus disproportionately on the latter, which is perhaps what brings us to this conundrum today.
In this post, I’m going to point out some of the common causes of disparities between mobile and desktop internal linking, when you should care, and what you can do to fix these issues without throwing UX under the bus.
*(thanks to Dom Woodman and the wealth of data at his fingertips for confirming for me that this is still the case!)
Back in 2015, SEOs had two months’ warning to prepare for what the industry nicknamed “Mobilegeddon”. This wasn’t the first time that Google had factored mobile friendliness into its rankings, but it was probably the first time they tried to make a really big deal out of it as a way of steering webmasters — a sign of things to come.
About 18 months later, in November 2016, we got the phrase “Mobile-first indexing”. Over the next few years, SEOs with access to multiple Search Console properties became familiar with the routine trickle of emails informing them of sites moving over to the new paradigm.
During this period, some SEOs, including the late Russ Jones, myself in the aforementioned post on the Moz Blog, and my old boss Will Critchlow, started to voice concerns about the potential impact on the linkgraph.
The overall impression at the time was that Google was using a hybrid index for now, but that “mobile only” was already on its way.
Fast forward to March 2020, and Google warned we had six months to prepare for the final toll of the desktop index. This initially suggested a September 2020 rollout, then that became March 2021, and then, as I’ve mentioned above, that date too seemed to pass without incident.
We should assume, though, that this is still coming, or perhaps largely already here, and as such that our mobile sites need to present the version of truth we want Google to see.
Internal links, like all other links, fulfill multiple vital functions:
Allowing search engines to discover new URLs
Passing on clues as to topical relevance, via their anchor text and source URL
Passing on authority, via PageRank or equivalent
That’s of course without even getting into their roles in user experience, which is a topic for another post. (Although if you want to learn more about internal links, I recommend this Whiteboard Friday.)
A disparity in internal links between desktop and mobile versions, then, is likely to have far-reaching implications. (This also goes for any other two versions, such as rendered and raw HTML.) In most cases, one of the two versions will be the one that the site’s SEO practitioner(s) were happy with, and as such the other will not be.
At this point it’s common best practice, at least for your major templates, to routinely produce a list of links from both versions of the page and look for discrepancies.
That said, some differences are more impactful than others. For illustrative purposes, I’ve compared the desktop and mobile versions of five homepages, and in the rest of this post I’ll discuss some of the more interesting differences I noted, and what I’d recommend to the respective sites. Just to be clear: I am not involved with, or indeed pitching, any of these sites.
The five homepages I looked at were:
https://www.amazon.co.uk/ — the UK site of the global e-com juggernaut
https://www.optimizely.com/ — the well known CRO software
https://www.ebuyer.com/ — an electronics e-commerce site
https://www.zoopla.co.uk/ — a UK real estate site, similar to the US’s Zillow
https://www.nytimes.com/ — an American broadsheet newspaper
Interestingly, of these, two had no differences at all for us to discuss — congratulations to Optimizely and Zoopla for paying attention back in 2018. For the other three, read on…
The Amazon UK homepage links to itself no fewer than six times, with anchor text such as “back to top”, “see product details”, and “next page” (within a carousel). These links are all unique to desktop, although the mobile version does have a “Top of page” link instead of the “Back to top” link.
Amazon UK desktop (top) vs. Amazon UK mobile (bottom)
You probably don’t need to be too concerned about links like these from an SEO perspective. There’s no dramatic difference in optimization or targeting implied by the different text, and pages linking to themselves probably aren’t going to reshape the linkgraph.
Amazon UK desktop (top) vs. Amazon UK mobile (bottom)
The main nav link to the “Pet supplies” category on the Amazon UK homepage comes with different internal tracking tags on mobile vs. desktop.
However, from a specific mobile/desktop parity point of view, this isn’t a big deal. Both URLs share a canonical tag pointing to the same place, so we end up with equivalent behavior.
A similar rule applies when linking to pages like “my account” or “basket” — there may be differences in desktop and mobile implementations, but as both pages are noindex and/or robots.txt blocked, it isn’t a big deal.
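To illustrate the pattern (with hypothetical URLs and parameter names, since the actual tags aren't reproduced here): the two nav links differ only in a tracking parameter, and the target page declares a single canonical URL, so the variants consolidate into one indexable address.

```html
<!-- Hypothetical example: desktop and mobile nav links to the same
     category, differing only in an internal tracking parameter -->
<a href="/pet-supplies?ref=nav_desktop">Pet supplies</a>  <!-- desktop nav -->
<a href="/pet-supplies?ref=nav_mobile">Pet supplies</a>   <!-- mobile nav -->

<!-- On the /pet-supplies page itself, a canonical tag collapses
     both tagged URLs into the same indexable address -->
<link rel="canonical" href="https://www.example.com/pet-supplies">
```

Because both variants canonicalize to the same place, signals are consolidated onto one URL, which is why this particular kind of disparity is harmless.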
Ebuyer has a few instances of the same element using different anchor text on mobile vs. desktop:
Ebuyer desktop (top) vs. Ebuyer mobile (bottom)
Note the longer anchor text on mobile(!). I also noticed something similar on the New York Times site, although that may be due to them rapidly testing different headline variants.
Either way, I don’t think this is a huge deal as long as the behavior is intended and the implied topic is largely similar, which it is in these cases.
One of the most common causes of disparity is navigation elements that are desktop-only. The example below is from Ebuyer, and shows a bunch of links that I was unable to find anywhere on their mobile homepage.
These links all point to URLs that also feature in the top-nav, so the impact on the link graph may not be huge. However, Google is likely to place different weightings on a prominent homepage link like this vs. a link buried in a navigation, so there are SEO implications to this disparity. Ebuyer’s desktop site implies that these are some of the most important subcategories on the site, whereas their mobile site gives them a more equal footing with other subcategories in the mega-menu.
Happening across millions of sites, this is the sort of issue that might impact the quality of Google’s results. Ebuyer has presumably featured here the categories that are core to their business, and if they rank slightly better in these cases than in other cases, that means Google is slightly more likely to show people results from a business that is highly competent in that area. That, from Google’s perspective, is surely a win, but one they miss out on by exclusively using the mobile version.
From Ebuyer’s point of view, the choice of what to feature in this element is a strategic lever that is lost when Google stops counting their desktop links. The only real solution here is to develop a mobile equivalent to this element, but one can be creative. It could be somewhere slightly different on the page, for example, or it could be a carousel on mobile but static on desktop. Alternatively, you can accept that this is a desktop-specific UX element that should be disregarded in any SEO consideration, and instead must justify itself through its benefit to conversion rates.
Many sites, especially e-commerce, handle internal linking by having a huge mega-menu on desktop that collapses into a hamburger menu perhaps four layers deep on mobile. This leaves users many taps away from anything they might hope to find, and the ironic thing is that super-exhaustive top navigations aren’t necessarily optimal from an SEO perspective either. Sure, they get a lot of pages crawled and pass on a little equity, but they do nothing to concentrate relevance around subtopics, and they don’t allow you to focus your strength where it’s most needed.
Some sites improve on this with a section-specific subnavigation, for example these links on Amazon that only appear within the Grocery section:
This is a great alternative to a mega-menu in general, in that there are fewer sitewide links (meaning that each remaining sitewide link is a little stronger), and, proportionately, more links between closely related pages.
However, of course, this element doesn’t appear at all on mobile. D’oh.
Similarly, Amazon has these featured subcategories on desktop, performing a similar role:
Again, I’d say this is a great idea from an SEO perspective, but these links don’t exist on mobile.
Zoopla handles the same issue much more neatly:
Sidebar links to relevant subcategories
They similarly have subcategory links that only feature in the relevant category, but then on mobile, they retain them — just moving them to the bottom of the page instead of a sidebar:
Sidebar links shuffled to bottom of content on mobile
This isn’t hugely attractive, but it doesn’t matter — few people will scroll to these depths anyway, and Zoopla’s SEO strategy is robust to the mobile-only index as a result. Plus, because of the focus on interlinking only relevant subcategories, the volume of links here isn’t extreme.
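As a sketch of how this pattern can be implemented (an illustration only, not Zoopla's actual code, and the class names are made up): put the subcategory links after the main content in source order, then use a media query to move them into a sidebar on wider screens. Mobile users and crawlers get the links at the bottom of the page; desktop users see a sidebar.

```html
<!-- Links render below the content on mobile by default;
     a media query moves them into a sidebar on larger screens -->
<style>
  @media (min-width: 768px) {
    .layout { display: flex; }
    .subcategory-links { order: -1; flex: 0 0 220px; }
  }
</style>
<div class="layout">
  <main>
    <!-- Category content here -->
  </main>
  <nav class="subcategory-links">
    <a href="/category/subcategory-1">Subcategory 1</a>
    <a href="/category/subcategory-2">Subcategory 2</a>
  </nav>
</div>
```

The key point is that the links exist in the same markup for both layouts, so the mobile link graph matches the desktop one.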
A similar argument could be made for Ebuyer’s treatment of SEO copy here:
It’s right at the bottom of the page, so perhaps this is an opportunity for internal linking? Indeed, there are a couple of links at the end of this block of text.
Without going too much into the benefits and drawbacks of this kind of copy in general, I’d say this is a little excessive for the bottom of an e-commerce category page (you can only see a fraction in the screenshot above). Instead, Ebuyer could do something similar to what they’ve done with their footer:
Collapsed or tabbed content can be a great way to handle bulky internal linking structures on mobile
On desktop, all of these footer sections are expanded by default, and all visible. On mobile, they’re hidden in these expandable sections. This is generally a good way to handle SEO elements on mobile, as Google has said repeatedly at this point that there’s no downside to doing this.
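The native `details` element is one lightweight way to implement this kind of expandable section (a sketch, not Ebuyer's actual implementation). The links stay in the HTML whether or not the section is expanded, so crawlers see them regardless of the visual state:

```html
<!-- Footer link section, collapsed by default on mobile.
     The links are present in the markup either way, so they
     count for internal linking regardless of visual state. -->
<details class="footer-section">
  <summary>Customer services</summary>
  <ul>
    <li><a href="/help/delivery">Delivery information</a></li>
    <li><a href="/help/returns">Returns policy</a></li>
  </ul>
</details>
```

On desktop you could add the `open` attribute (or expand the sections via CSS or a small script) so everything is visible by default, matching the expanded-footer behavior described above.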
I’ve tried to explore here some of the common issues that sites face when aiming for mobile/desktop linking parity.
To quickly recap, the main issues I recommend sites focus on are:
Missing navigation elements
Opportunities for deep-linking without resorting to mega-menus
And my suggested solutions are:
Pushing linking widgets to the bottom of the page on mobile, rather than removing them altogether
Using tabs, carousels, expandable sections and other creative solutions to make better use of on-screen real estate
I’m keen to see more examples in the wild, though — how is your site handling mobile-first internal linking? Tell me on Twitter!
Reblogged 1 week ago from feedproxy.google.com
For so many of us, the pandemic has led to an increase in online shopping—we’ve been stuck inside, so of course, we’re scrolling through our favorite eCommerce sites trying to find a hint of joy. It’s a totally natural coping mechanism for these uncertain times. But it wasn’t expected. Retail businesses and shipping companies across…
The post How Email Marketing Can Be Used to Ease Customers During Supply Shortages appeared first on Benchmark Email.

Reblogged 1 week ago from www.benchmarkemail.com
As is the case with many other developers, the reports over the last few years of the huge energy requirements of the web have prompted me to take a look at my own websites and see what I can do to minimize their impact. This piece will cover some of my experiences in doing this, as well as my current thoughts on optimizing websites for carbon emissions, and some practical examples of things you can do to improve your own pages.
But first, a confession: When I first heard about the environmental impact of websites, I didn’t quite believe it. After all, digital is supposed to be better for the planet, isn’t it?
I’ve been involved in various green and environmental groups for decades. In all of that time, I can’t consciously remember anyone ever discussing the possible environmental impacts of the web. The focus was always on reducing consumption and moving away from burning fossil fuels. The only time the Internet was mentioned was as a tool for communicating with one another without the need to chop down more trees, or for working without a commute.
So, when people first started talking about the Internet having similar carbon emissions to the airline industry, I was a bit skeptical.
It can be hard to visualize the huge network of hardware that allows you to send a request for a page to a server and then receive a response back. Most of us don’t live in data centers, and the cables that carry the signals from one computer to another are often buried beneath our feet. When you can’t see a process in action, the whole thing can feel a little bit like magic — something that isn’t helped by the insistence of certain companies on adding words like “cloud” and “serverless” to their product names.
As a result of this, my view of the Internet for a long time was a little ephemeral, a sort of mirage. When I started writing this article, however, I performed a little thought experiment: How many pieces of hardware does a signal travel through from the computer I’m writing at to get outside of the house?
The answer was quite shocking: three Ethernet cables, a switch, two powerline adapters, a router and modem, an RJ11 cable, and several meters of electrical wiring. Suddenly, that mirage was beginning to look rather more solid.
Of course, the web (and, by extension, the websites we make) does have a carbon footprint. All of the servers, routers, switches, modems, repeaters, telephone cabinets, optical-to-electrical converters, and satellite uplinks of the Internet must be built from metals extracted from the Earth, and from plastics refined from crude oil. To then provide data to the estimated 20 billion connected devices worldwide, they need to consume electricity, which also releases carbon when it is generated (even renewable electricity is not carbon neutral, although it is a lot better than fossil fuels).
Accurately measuring just what those emissions are is probably impossible — each device is different and the energy that powers them can vary over the course of a day — but we can get a rough idea by looking at typical figures for power consumption, user bases, and so on. One tool that uses this data to estimate the carbon emissions of a single page is the Website Carbon Calculator. According to it, the average page tested “produces 1.76 grams of CO2 per page view”.
If you’ve been used to thinking about the work you do as essentially harmless to the environment, this realization can be quite disheartening. The good news is that, as developers, we can do an awful lot about it.
Recommended reading: How Improving Website Performance Can Help Save The Planet
Performance And Emissions
If we remember that viewing websites uses electricity and that producing electricity releases carbon, then we’ll know that the emissions of a page must heavily depend on the amount of work that both the server and client have to perform in order to display the page. Also, the amount of data that is required for the page, and the complexity of the route it must travel through, will determine the amount of carbon released by the network itself.
For example, downloading and rendering example.com will likely consume far less electricity than Apple’s home page, and it will also be much quicker. In effect, what we are saying is that high emissions and slow page loads are just two symptoms of the same underlying causes.
It’s all very well to talk about this relationship in theory, of course, but having some real-world data to back it up would be nice. To do just that, I decided to conduct a little study. I wrote a simple command-line interface program to take a list of the 500 most popular websites on the Internet, according to MOZ, and check their home pages against both Google’s PageSpeed Insights and the Website Carbon Calculator.
Some of the checks timed out (often because the page in question simply took too long to load), but in total, I managed to collect results for over 400 pages on 14 July 2021. You can download the summary of results to examine yourself but to provide a visual indication, I have plotted them in the chart below:
As you can see, while the variation between individual websites is very high, there is a strong trend towards lower emissions from faster pages. The mean average emissions for websites with a PageSpeed score of 100 is about 1 gram of carbon, which rises to a projected almost 6 grams for websites with a score of 0. I find it slightly reassuring that, despite there being many websites with very low speeds and high emissions, most of the results are clustered in the bottom right of the chart.
Once we understand that much of a page’s emissions originate from poor performance, we can start taking steps to reduce them. Many of the things that contribute to a website’s emissions are beyond our control as developers. We can’t, for example, choose the devices that our users access our pages from or decide on the network infrastructure that their requests travel through, but we can take steps to improve our websites’ performance.
Performance optimization is a broad topic, and many of you reading this likely have more experience than I do, but I would like to briefly mention a few things that I have observed recently when optimizing various pages’ loading speed and carbon emissions.
I recently reworked the design of my personal blog in order to make it a little more user-friendly. One of my hobbies is photography, and the website had previously featured a full-height header image.
While the design did a good job of showcasing my photographs, it was a complete pain to scroll past, especially when moving through pages of blog posts. I didn’t want to lose the feeling of having a photo in the header, however, and eventually settled on using it as a background for the page title.
The full-height header had been using srcset in order to make loading as fast as possible, but the images were still very big on high-resolution screens, and my Largest Contentful Paint (LCP) time on mobile for the old design was almost 3 seconds. A big advantage of the new design was that it allowed me to make the images much smaller, which reduced the LCP time to about 1.5 seconds.
On laptops and desktops, people wouldn’t have noticed a difference, because both versions were well under a second, but on much less powerful mobile devices, it was quite dramatic. What was the effect of this change on carbon emissions? 0.31 grams per view before, 0.05 grams after. Decoding and rendering images is very resource-intensive, and this grows exponentially as the images get bigger.
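For reference, the kind of responsive-image markup involved looks something like this (the file names and widths are illustrative, not my actual assets). The browser picks the smallest candidate that satisfies the rendered size and pixel density, so mobile devices avoid downloading and decoding the large desktop asset:

```html
<!-- The browser chooses the smallest image that fits the rendered
     size, sparing mobile devices the large desktop asset -->
<img
  src="/images/header-800.jpg"
  srcset="/images/header-400.jpg 400w,
          /images/header-800.jpg 800w,
          /images/header-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 1200px"
  alt="Landscape photograph used as the page-title background">
```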
The size of images isn’t the only thing that can have an impact on the time to decode; the format is important as well. Google’s Lighthouse often recommends serving images in next-generation formats to reduce the amount of data that needs to be downloaded, but new formats are often slower to decode, especially on mobile. Sending less data over the wire is better for the environment, but it is possible that consuming more energy to decode could offset that benefit. As with most things, testing is key here.
From my own testing in trying to add support for AVIF encoding to the Zola static site generator, I found that AVIF, which promises much smaller file sizes than JPG at the same quality, took orders of magnitude longer to encode; a finding supported by bunny.net’s observation that WebP encoding can outperform AVIF by as much as 100 times. While doing this, the server will be consuming electricity, and I do wonder whether, for websites with low visitor counts, switching to the new format might actually end up increasing emissions and reducing performance.
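If you do adopt a next-generation format, the `picture` element lets you serve it only to browsers that support it, with a JPG fallback for everyone else (file names here are illustrative):

```html
<!-- Browsers that support AVIF or WebP take the first matching
     source; everything else falls back to the JPG -->
<picture>
  <source srcset="/images/photo.avif" type="image/avif">
  <source srcset="/images/photo.webp" type="image/webp">
  <img src="/images/photo.jpg" alt="Description of the photo">
</picture>
```

This way, the decision of which format to decode is left to the client, which is useful while support and decode performance for the newer formats remain uneven.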
Recommended reading: The Humble img Element And Core Web Vitals
Another thing that can have a surprising impact on performance and emissions is where your data is coming from. Conventional wisdom has long said that serving assets such as frameworks from a central content delivery network (CDN) will improve performance because getting data from local nodes is generally faster for users than from a central server. jQuery, for example, has the option to be loaded from a CDN, and its maintainers say that this can improve performance, but real-world testing by Harry Roberts has shown that self-hosting assets is generally faster.
This has also been my experience. I recently helped a gaming website improve its performance. The website was using a fairly large CSS framework and loading all of its third-party assets via a CDN. We switched to self-hosting all assets and removed unused components from the framework.
None of the optimizations resulted in any visual changes to the website, but together they increased the Lighthouse score from 72 to 98 and reduced carbon emissions from 0.26 grams per view to 0.15.
This leads nicely onto the subject of sending users only the data they actually need. I’ve worked on (and visited) many, many websites that are dominated by stock images of people in suits smiling at one another. There seems to be a mentality amongst certain organizations that what they do is really boring and that adding photos will somehow convince the general public otherwise.
I can sort of understand the thinking behind this because there are numerous pieces on how the amount of time people spend reading is declining. Text, we are repeatedly told, is going out of fashion; all people are interested in now are videos and interactive experiences.
From that point of view, stock photos could be seen as a useful tool to liven up pages, but eye-tracking studies show that people ignore images that aren’t relevant. When people aren’t looking at your images, the images might as well be empty space. And when every byte costs money, contributes to climate change, and slows down loading times, it would be better for everyone if they actually were.
Again, what can be said for images can be said for everything else that isn’t the page’s core content. If something isn’t contributing to a user’s experience in a meaningful way, it shouldn’t be there. I’m not for a moment advocating that we all start serving unstyled pages — some people, such as those with dyslexia, do find large blocks of text difficult to read, and other users almost certainly would find such pages boring and go elsewhere — but we should look critically at every part of our websites to consider whether they are earning their keep.
Accessibility and the Environment
Another area where performance and emissions converge is in the field of accessibility. There is a common misconception that making websites accessible involves adding extra code to a page, when the foundation is really simpler, more semantic markup.

MDN Web Docs has some very good tutorials on accessibility. In “HTML: A Good Basis for Accessibility”, they cover how the best foundation of an accessible website lies in using the correct HTML elements for the content. One of the most interesting sections of the article is where they try to recreate the functionality of a button element using a div.

This is obviously a minimal example, but I thought it would be interesting to compare the size of this button version to one using standard HTML elements. The fake button example in this case weighs around 1,403 bytes uncompressed, whereas the equivalent built with an actual button element is considerably smaller. The div button is also semantically meaningless and, therefore, much harder for people with screen readers to use and for bots to parse.
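The contrast is easy to see side by side. A sketch along the lines of MDN's example (where `submitForm` is a placeholder function): the fake button needs a role, keyboard focus, and scripted key handling just to approximate what the real element provides for free.

```html
<!-- Fake button: needs ARIA, tabindex, and JavaScript just to be
     focusable and operable from the keyboard -->
<div class="fake-button" role="button" tabindex="0"
     onclick="submitForm()"
     onkeyup="if (event.key === 'Enter') submitForm()">
  Submit
</div>

<!-- Real button: focusable, keyboard-operable, and announced
     correctly by screen readers with no extra code -->
<button type="submit">Submit</button>
```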
Recommended reading: Accessible SVGs: Perfect Patterns For Screen Reader Users
On a larger scale, I was recently refactoring the HTML of a website that I work on — doing things like removing redundant title attributes and replacing divs with more semantic equivalents. The original page had a structure like the following (content removed for privacy and brevity):
<div class="container">
  <section>
    <div class="row">
      <div class="col-md-3">
        <aside>
          <!-- Sidebar content here -->
        </aside>
      </div>
      <div class="col-md-9">
        <!-- Main content here -->
        <h4>Content piece heading</h4>
        <p>
          Some items;<br>
          Item 1 <br>
          Item 2 <br>
          Item 3 <br>
          <br>
        </p>
        <!-- More main content here -->
      </div>
    </div>
  </section>
</div>
With the full content, this weighed 34,168 bytes.
After refactoring, the structure resembled this:
<div class="container">
  <div class="row">
    <main class="col-md-9 col-md-push-3">
      <!-- Main content here -->
      <h3>Content piece heading</h3>
      <p>Some items;</p>
      <ul>
        <li>Item 1</li>
        <li>Item 2</li>
        <li>Item 3</li>
      </ul>
      <!-- More main content here -->
    </main>
    <aside class="col-md-3 col-md-pull-9">
      <!-- Sidebar content here -->
    </aside>
  </div>
</div>
It weighed 32,805 bytes.
The changes are currently ongoing, but already the markup is far more accessible according to WebAIM, Lighthouse, and manual testing. The file size has also gone down, and, when averaging the time from five profiles in Chrome, the time to parse the HTML has dropped by about 2 milliseconds.
These are obviously small changes and probably won’t make any perceptual difference to users. However, it is nice to know that every byte costs users and the environment — making a website accessible can also make it a little bit lighter as well.
Project Gutenberg’s HTML version of The Complete Works of William Shakespeare is approximately 7.4 MB uncompressed. According to Android Authority in “How Much Data Does YouTube Actually Use?”, a 360p YouTube video weighs about 5 to 7.5 MB per minute of footage, and 1080p about 50 to 68 MB. So, for the same amount of bandwidth as all of Shakespeare’s plays, you will get only about 7 seconds of high-definition video. Video is also very intensive to encode and decode, and this is probably a major contributing factor to estimates of Netflix’s carbon emissions being as high as 3.2 kg per hour.
Most videos rely on both visual and auditory components to communicate their message, and large file sizes require a certain level of connectivity. This obviously places limits on who can benefit from such content. Making video accessible is possible but far from simple, and many websites simply don’t bother.
If video was only ever treated as a form of progressive enhancement, this would perhaps not be a problem, but I have lost count of the number of times I have been searching for something on the web, and the only way to find the information I wanted was by watching a video. On YouTube, the average number of monthly users grew from 20 million in 2006 to 2 billion in 2020. Vimeo also has a continually growing user base.
Despite the huge number of visitors to video-sharing websites, many of the most popular ones do not seem to be fully compliant with accessibility legislation. In contrast to this, numerous types of assistive technologies are designed to make plain text accessible to as wide a variety of people as possible. Text is also easy to convert from one format to another, so it can be used in a number of different contexts.
As we can see from the example of Shakespeare, plain text is also incredibly space-efficient, and it has a far lower carbon footprint than any other form of human-friendly information transmitted on the web.
Video can be great, and many people learn best by watching a process in action, but it also leaves some people out and has an environmental cost. To keep our websites as lightweight and inclusive as possible, we should treat text as the primary form of communication wherever possible, and offer things such as audio and video as an extra.
Recommended Reading: Optimizing Video For Size And Quality
Hopefully, this brief look at my experience in trying to make websites better for the environment has given you some ideas for things to try on your own websites. It can be quite disheartening to run a page through the Website Carbon Calculator and be told that it could be emitting hundreds of kilograms of CO2 a year. Fortunately, the sheer size of the web can amplify positive changes as well as negative ones, and even small improvements soon add up on websites with thousands of visitors a week.
Even though we are seeing things like a 25-year-old website increasing 39 times in size after a redesign, we are also seeing websites being made to use as little data as possible, and clever people are figuring out how to deliver WordPress in 7 KB. So, in order for us to reduce the carbon emissions of our websites, we need to make them faster — and that benefits everybody.
It seems like every month, there’s a new social media channel that pops up. Should you create a TikTok account? Clubhouse? Would Twitter Spaces work for your business? It can be tempting to be active on all the available social media channels but is that the right course of action?
Instead, it’s best to strategically choose the right social media channels for your business. Going through the process of creating a social media strategy will help you determine which channels are best for your business. There are also several points for you to consider before you hit that create account button.
The short answer is yes, but the long one is more complicated. It depends.
Depends on what? A number of factors go into deciding if your business should hop onto a social media channel.
If you’re already meeting your goals and KPIs and reaching your audience without social media, then you may be doing fine without it. But if you want to increase exposure or reach new and current customers, then you should consider at least one social media channel.
A recent Sprout Social Index™ report found that consumers prefer to use social media for sharing feedback about a product (31%) and reaching out with a customer service issue or question (33%). On the marketing end, marketers use social media to collect data in a number of ways. From the same study, 88% of marketers report that their social media strategy positively influenced their sales and 90% agree that social media helps them stay ahead of their competitors.
So the long answer to “does your business need a social media channel?” is yes, if you want to build up your community, increase brand awareness, generate sales leads and keep tabs on your competitors.
Let’s take a look at what you should consider when choosing a social media channel.
Different platforms offer different advantages. Oftentimes, social media goals align with your overall business goals. So when you’re setting your social media goals, a few platforms will stand out as the best ones to reach those goals.
According to marketers, their top goals in social are to:
Once you’ve identified your top goals, write them down to reference as you consider your channel choices.
Let’s be honest. Social media takes time and effort. Between creating content and scheduling posts, there are strategies to consider and new features to keep up with.
Reflect on these questions:
Resources aren’t limited to people or time, either. The cost calculation of your social media ROI also includes the software you use to post and analyze. There is a wide range of digital marketing tools available; the question to think about is whether they will be part of your resources.
Some companies already have an established media library. That’s great. That’s visual content for you to repurpose into social media content. You might also have blog posts already written. That’s even more content that you can use on social.
Take stock of the existing content you have and whether you’ll be able to create more. Consider how that new content will be created, and by whom. Will the social media manager also be the social media photographer? Are videos being outsourced?
There are five main types of social media content: video, images, text, stories and live video. The content that you create and curate will directly influence your social media channel decision.
Are you able to upload a lot of video content? Then Facebook, YouTube and Instagram should be at the top of your list. Yes, it might be very tempting to jump into TikTok but if you aren’t able to create the video content that is necessary for the platform, then this channel is not for you.
It’s best to be realistic because content curation and creation take time. When you start out on a new network, you want to make sure you have the right type and amount of content to post on a consistent basis.
Even if you aren’t present on a social media channel, chances are that your customers are. Check your website data to see where your referrals are coming from. What are they clicking on and where are they clicking from? Knowing that you have a customer base already on a channel makes it easier for you when you establish your brand presence.
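If you have raw server logs rather than an analytics dashboard, tallying referring domains is straightforward. A minimal sketch, assuming Apache/Nginx combined log format (the sample lines are made up for illustration):

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Tally referring domains from combined-format access log lines.
# These sample lines are hypothetical, for illustration only.
LOG_LINES = [
    '1.2.3.4 - - [01/Aug/2021:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "https://www.facebook.com/" "Mozilla/5.0"',
    '1.2.3.5 - - [01/Aug/2021:10:01:00 +0000] "GET /blog HTTP/1.1" 200 512 "https://t.co/abc" "Mozilla/5.0"',
    '1.2.3.6 - - [01/Aug/2021:10:02:00 +0000] "GET / HTTP/1.1" 200 512 "https://www.facebook.com/" "Mozilla/5.0"',
]

def referrer_counts(lines):
    """Return a Counter of referring domains, skipping empty referrers."""
    counts = Counter()
    for line in lines:
        # In combined format, the referrer is the second-to-last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        if len(quoted) >= 2 and quoted[-2] not in ("", "-"):
            counts[urlparse(quoted[-2]).netloc] += 1
    return counts

print(referrer_counts(LOG_LINES).most_common())
# → [('www.facebook.com', 2), ('t.co', 1)]
```

A tally like this quickly shows which channels are already sending you traffic before you commit to a presence there.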
One exercise you can do is to define your target audience. Once you have it identified, you can match it up with current statistics for the various social media platforms. Some demographics are more present on certain channels than others. Having this data on hand will help you decide on a social media channel.
While Facebook, Instagram and YouTube are the top social platforms for brands and consumers, it doesn’t mean that those are your go-to channels. Channel popularity varies between industries. So while these benchmarks for overall brands and consumer behaviors are a good place to start, you’ll need to do your own research for your industry.
To get an idea of your industry’s presence on different networks, start with a competitor analysis. After that, review some industry benchmarks to see what has generally worked per channel.
For example, the average sports industry company publishes 42 posts a day across Facebook, Instagram and Twitter. With 227 messages received on a daily basis, that gives you a rough idea of what to expect for engagement, and you can plan your staffing accordingly. Of course, if you’re just starting out, you won’t have these numbers, but the benchmarks give you specific figures to use when setting your social media goals.
Take a moment and research your competitors’ social media profiles. Where are they posting and what are they posting about? What types of posts are working for them? What are the comments saying about them? Answers to these questions will inform your understanding of how your competitors are performing.
Social media is great for competitor analysis. The top two ways marketers use social data are to gauge the strength of customer loyalty and to reveal the strengths and weaknesses of competitors’ products or services. When your competitors are on a certain channel and performing well, it’s a good sign that you should be, too.
An easy way to keep track of your competitors’ performance is to use a competitor report like the one offered by Sprout Social. It’ll match your performance and growth against the competitors you add in an easy-to-read graph.
All things considered, you might have a good idea of which social media channels you want to focus on now. And you might even know who on your team or in your business would be managing all of them. The next question to ask yourself is what tools you’ll be using.
Scheduling, analytics, previews and engagement management are all typical features of social media management software. Some tools focus on just a few networks, while others, like Sprout, cover the top social media platforms as well as review platforms such as Google My Business and TripAdvisor. The more channels you’re on, the more time you’ll spend managing them.
If you’re wondering if an all-in-one solution would help you reclaim your time, the answer is yes. Management becomes easier, especially if you have an inbox that compiles all your engagements into one view. No more jumping between networks. You can respond to a Facebook review and look at the most recently tagged photo on Instagram all from the Smart Inbox.
There are over 15 social media platforms for you to choose from. It’s overwhelming, to say the least. How will you choose the right ones for you? Ask yourself the above seven questions to narrow down your choices.
In the last year, consumer social media use has increased across all generations. If you haven’t already, it’s time to invest in social media. And if you’re hesitant about adding more channels, the data shows that usage and spending will continue to rise. You won’t want to be left behind.
Explore how you can improve on your social strategy with a personalized demo from Sprout Social.
The post Choosing the right social media channels for your business appeared first on Sprout Social.
Reblogged 2 weeks ago from feedproxy.google.com
You may already be familiar with STAT Search Analytics and its rank tracking abilities, but did you know it can also help you discover SEO opportunities on a massive scale? In today’s Whiteboard Friday, Cyrus shows you how to dig into STAT to do just that.
Hi, everybody. Welcome. My name is Cyrus. Today the thing I want to talk about is how to use STAT to find SEO opportunities at scale, and I mean massive scale.
Now a lot of you have probably heard of STAT. You may know that it has an excellent reputation. But it’s possible you haven’t actually used it or don’t have a very good understanding of what it actually does.
So that’s what I’m going to try to cover today and explain how powerful it is at discovering SEO opportunities in ways that can inform content strategy, competitive analysis, and a lot more.
So STAT, the full name of STAT is actually STAT Search Analytics. On the surface, what a lot of people understand is that it is a rank tracker, tracking thousands of keywords at a time anywhere across the globe. But underneath the hood, it’s actually a lot more than a rank tracker. It’s a competitive landscape tool. It’s SERP analysis and intent. It allows you to do some pretty incredible things once you dig into the data.
So let me dig into a little bit about how it actually works. So like a lot of keyword rank trackers, you start with keywords. But one of the differences is all the different attributes that you can assign to each of your keywords.
So first is very familiar, the market or the search engine. So you want Canadian English results or Canadian French results. Pretty much any market in the world is available for you to use in STAT.
The second is location, which is a slightly different concept. So you can define ZIP Codes, cities, be as specific as you want. This is very important for multiple location businesses or if you’re running an advertising campaign in a certain part of the country and you want to track very specific results. But you can define location very specifically for each of your keywords.
Third is device, mobile or desktop, especially important with mobile-first indexing and increasing mobile results. But also tags, smart tags, and this is where the true power of STAT comes in, the ways that you can use smart tagging.
So you can tag your keywords in multiple ways, assigning multiple tags to slice them and dice them any way you want.
So the different ways you can tag keywords in STAT cover anything that’s important to your business. For example, you can create keyword groups based on what matters to you. At Moz, we tag keywords with “SEO” in them, or anything relevant to your business that you want to create a keyword cohort out of. Or location, like we were talking about: if you’re running an advertising campaign in Indiana, you can tag the keywords you’re targeting there, or all your Kansas City keywords or your London or Berlin keywords.
Product categories. So if you sell multiple categories, you sell TVs, books, dresses, anything you want, you might want to tag all of those into a particular keyword category. Or attributes, such as a 55-inch television versus a 48-inch television, when you want to get very, very specific across your product line.
Also your brand. At Moz, we track everything with the word “Moz” in it, or Nike or Apple or whatever your brand is or if you have multiple brands. Basically, anything that’s important to your business, any KPI that you measure, anything that’s relevant to your marketing department or finance or anything else like that, you can tag, and that’s where the true power comes in, because once you tag, you’ve created a keyword cohort or a group.
Then you can see your share of voice across that entire market using just that group. So if you want to track yourself against a very specific set of keywords, you can see your share of voice, share of voice meaning how much visibility you have in Google search results, and STAT will show you your exact competitors and how you rank among those.
Generally, you want to see yourself going up and to the right. But if you’re not, you can see exactly who’s beating you and where their movement is, and how you’re doing for that specific keyword group, which is incredibly valuable when you’re working on a particular set of keywords or a campaign.
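STAT’s exact share-of-voice formula isn’t spelled out here, but the underlying idea, visibility weighted by rank across a keyword group, can be sketched. The click-through rates and sample rankings below are illustrative assumptions, not STAT’s actual model:

```python
# Sketch of a share-of-voice calculation: each keyword contributes its
# search volume weighted by an assumed click-through rate for the rank
# a domain holds. CTR figures are illustrative, not STAT's real model.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

# (keyword, monthly volume, {domain: rank}) — hypothetical sample data
RANKINGS = [
    ("seo tools", 1000, {"moz.com": 1, "example.com": 3}),
    ("rank tracker", 500, {"moz.com": 2, "example.com": 1}),
]

def share_of_voice(rankings):
    """Return each domain's visibility as a share of total weighted volume."""
    visibility = {}
    for _keyword, volume, ranks in rankings:
        for domain, position in ranks.items():
            ctr = CTR_BY_POSITION.get(position, 0.0)
            visibility[domain] = visibility.get(domain, 0.0) + volume * ctr
    total = sum(visibility.values()) or 1.0
    return {domain: value / total for domain, value in visibility.items()}

print(share_of_voice(RANKINGS))
```

However the weighting is done, the useful property is the same one the transcript describes: a single percentage per domain for a keyword cohort, so you can watch it move up and to the right.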
But my favorite part — and this is where the true power comes in, because it can inform your content strategy and this is where the SEO opportunities are actually at — is the analysis of SERP features and intent. Because what STAT will do is, out of the thousands of keywords that you put into it, it will analyze the entire SERP of each of those and it will collect all the SERP features that it finds and tell you exactly what you own and don’t own and where your opportunities are.
So let’s give an example that’s a little more concrete. So let’s say you track a bunch of keywords within a particular cohort and you see that most of the results have a featured snippet. STAT will show you exactly what you own and what you don’t own. Now what’s cool about this is you can click into what you don’t own and you can see the exact featured snippets that your competitors own that you can actually create some content strategy around and try and go steal those.
A different way is images or news. So let’s say that you notice that you’re selling TVs or something like that and almost all the SERPs have images and you don’t own any of them. So something like that can inform your content strategy, where you go to your team and you say, “Hey, folks, we need to create more images, or we need better structured data to get Google to show the images because this is the intent for this type of keyword, and we’re simply not owning it in this way.”
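As a concrete illustration of the structured-data point, image-bearing product pages typically describe themselves with schema.org JSON-LD embedded in the page. A minimal sketch, with hypothetical product details, generated here with Python for clarity:

```python
import json

# Minimal schema.org Product markup with an image, emitted as a JSON-LD
# <script> block. The name and image URL are hypothetical placeholders.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "55-inch Television",
    "image": ["https://example.com/images/tv-55.jpg"],
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(product, indent=2)
    + "\n</script>"
)
print(snippet)
```

Image and product rich results depend on markup like this being present and valid, which is why missing structured data can mean missing image SERP features.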
Same thing with news. If you notice a lot of news results and you’re not a news organization but you’re competing for these keywords, that can inform your content strategy and maybe you need to go after those news keywords or try something else. Video is another one. More and more SERPs have video results with video carousel and things like that. You can see exactly what you own and what you don’t own.
A lot of times you’re going to find that certain domains are beating you on those videos and that may inform, especially for the high volume keywords that you want to go after, you may want to be creating more video content for that. But it all depends on the SERP, and you’re going to find different feature sets and different combinations for every keyword cohort that you do.
So what’s important to you and what’s important to track is going to show up differently every time, but it’s going to show you exactly where the opportunities are. FAQs are another one, those rich snippet sort of results. You may find that your competitors are all using FAQ markup and you’re not using any. That could inform your SEO strategy, and you might start incorporating more FAQs because Google is obviously rewarding those in the SERPs and your competitors are gaining those and not you.
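The FAQ markup mentioned here is typically schema.org FAQPage structured data embedded as JSON-LD. A minimal sketch, with hypothetical question-and-answer text:

```python
import json

# Minimal schema.org FAQPage markup of the kind discussed above.
# The question and answer text are hypothetical placeholders.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Do you ship internationally?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Yes, we ship to most countries.",
            },
        }
    ],
}

markup = (
    '<script type="application/ld+json">\n'
    + json.dumps(faq, indent=2)
    + "\n</script>"
)
print(markup)
```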
Other things, virtually any SERP feature that’s trackable. You can find local results. Twitter boxes. You may find that for certain queries Google is surfacing Twitter results and maybe that means you need to be on Twitter more than you actually are right now and see who’s ranking for those results instead of something that you’re doing on-site.
Maybe you need to do more YouTube. It’s not all necessarily on your site. But this will tell you where you need to invest those opportunities. Review stars, podcasts, and more. All of this will tell you what’s important and where the opportunities are, where you’re winning and losing, the exact keywords you can go after if you want to win, and the exact feature sets where your competitors are getting traffic and you aren’t.
So I use STAT, I love it, every week. It’s a great tool. If you want to try it out, I encourage you to do so. That’s it for me. Thanks, everybody.
Reblogged 2 weeks ago from feedproxy.google.com
Plus, is pinning the solution to control RSAs?
Please visit Search Engine Land for the full article.
Reblogged 2 weeks ago from feeds.searchengineland.com
Now you can push your content and HTML directly to Bing Search without the need for crawling.
Please visit Search Engine Land for the full article.
Reblogged 2 weeks ago from feeds.searchengineland.com
This post was first published on Jun 17, 2008.
Perhaps the fastest way to let your blog go is to stop posting to it. It can happen for many reasons, whether it be your life getting busy, suffering a bout of blogger’s block or becoming distracted by another project.
Most bloggers go through one or all of these issues at one point or another, and as a result posting frequency can drop, if not stall completely.
The solution is pretty obvious – if you want a vibrant blog – you do need to post to it.
While your archives might contain a lot of great content and people will find them via search engines – if you’re wanting to grow your blog you will only be able to do that if you add fresh content on a regular basis.
I know it’s not easy – all successful bloggers go through patches where it’s challenging to keep things rolling – however if you put your mind to it you can definitely get things back on track. It is never too late to get your blog rolling again!
1. Set Goals and Deadlines – If you’ve let your blog go through lack of posting – set yourself some goals this week to pull yourself out of your rut. Don’t be too ambitious – but set yourself some achievable goals to get yourself going again. Perhaps your goal will be simply to post once this next week. The week after ramp it up to twice, the week after aim for three times….
2. Try Something New – one of the tactics that I find most helpful in getting my posting frequency back up is to try something new on my blog. Whether it be tackling a topic I’ve not looked at before, starting some kind of project or competition, starting a series of posts or writing in a different style or voice – sometimes doing something ‘new’ can not only give you energy but your readers also.
3. If you’re suffering from Blogger’s Block – I’ve written a series of posts on how to battle blogger’s block and have compiled it all into one page here. One of the tips in that series is to try a new blogging environment. I regularly get out of the house to do some blogging (cafes are my favorite place) but really any change can be helpful. If you’re not able to actually take your computer out of the house (you don’t have a laptop) then grab a notebook and head out with that to brainstorm topics, write or do some planning of your blog.
4. If you’re simply feeling apathy towards Blogging – I’ve written a post on Declaring War on Blogger Apathy which has a few practical suggestions on how to get through that challenge. One thing that I mention in that post is to try writing shorter posts. I find that sometimes I get quite uninspired if I set myself the task to write a long post. It all can seem a little overwhelming and a little too much like hard work. So why not break down the topic into something more bite sized? I find that when I do this I can get a post written quite quickly and also find that readers sometimes appreciate something a little more focused and able to be read quickly.
5. Develop a Points System – Last year I posted about a points system that one blogger developed to keep their blogging moving along. While it might not be perfect for you – I like the idea of it and it can easily be adapted to suit your situation.
6. Set Rewards – some people respond well to rewards and incentives (I know I do). Once you’ve set some goals or made your points system – set yourself a few rewards that you’ll give yourself when you reach certain milestones.
7. Find a Blog Buddy – I mention in a couple of the links above about how I find it motivating to work with another person in my blogging. Share your goals with another blogger (or non blogger if you want) and ask them to keep you accountable to them. If you’re looking for a blog buddy you might even like to ‘advertise’ for one in the comments below this post. Pair up with another ProBlogger reader for a week or two and see what you could achieve together.
8. Repurpose Something from Your Daily Life as a Post – a few weeks ago I shared how one great way to come up with new content is to look at something you already do in your daily life and work out how to capture it and repurpose it as a post. Video yourself doing something, record a conversation, use an email interaction etc. So many things that you do each day are potential content for your blog – the key is to be aware of them and find a way to collect and use them.
9. Start Blogging
OK – I could talk about how to get yourself going for paragraphs and paragraphs – but in doing so I’m probably just distracting you from the task at hand. So now it’s time to go and do it. Step away from the distractions that might be stopping you from blogging and go and do it. Go on – I know you can!!!
The post 9 Tips to Help you Post More Frequently On Your Blog appeared first on ProBlogger.
Reblogged 2 weeks ago from feedproxy.google.com