
The 4Es of Video: How to Align Your Marketing Content Strategy With Buyer Expectations

The expectations of today’s B2B buyers have changed; they now seek out on-demand content to research solutions on their own before contacting your company. That’s why video is the perfect content format for reaching those buyers. Here’s how. Read the full article at MarketingProfs

Reblogged 1 day ago from www.marketingprofs.com

Facebook has made it easier than ever to profit off teen girls' insecurity

Instagram drives users to compare and compete. It comes at a huge cost for some teen girls.

When Facebook spent $1 billion to buy Instagram in 2012, it sought the customers key to its continued growth: teenagers.

As adolescents and young adults fled Facebook for platforms like Instagram and Snapchat, Facebook knew its long-term survival depended on winning over that demographic. But the savvy business move had a different, less public price tag.

Caught up in recommendations from a powerful algorithm designed to keep them engaged, some teen girls found Instagram worsened their body image, according to a new Wall Street Journal investigation. Users even pinned feelings of increased depression, anxiety, and suicidal thinking on the app.

The Journal found that studies conducted privately by the platform to better understand how Instagram affects young users led to alarming results. Internal research documents from the past few years, which the Journal reviewed, revealed that a third of teen girls who already felt bad about their bodies said Instagram made them feel worse. For teens who expressed suicidal thinking, 6 percent of U.S. users and 13 percent of British users identified their experience on Instagram as a reason for those feelings.

“Comparisons on Instagram can change how young women view and describe themselves,” read one slide posted to an internal Facebook message board.



Evidently, Facebook, which prefers to point to its lofty ideal of doing good by connecting the world while minimizing the platform’s real and potential harms, has known since at least 2019 that its product does real damage to some young people, particularly girls. Aside from acknowledging that some users said “like” counts made them feel anxious, the company disclosed almost nothing about its research. In a statement published in the wake of the Journal’s revelations, an Instagram executive said the company wanted to be more transparent about internal research in the future.

For years, child safety advocates and journalists, including myself, have tried to offer youth and their parents guidelines for using social media wisely, and coping skills for when things go wrong. But that approach has limits. The Journal’s reporting makes clear that children and their caregivers are up against a ruthless business model in which Facebook, the companies that advertise on Instagram, and the influencers who stand to make a fortune from amassing impressionable followers all profit off the vulnerability and insecurity of Instagram’s teen users.

What’s happening on Instagram for young girls is the age-old marketing tactic of inviting the customer to compare their life to someone else’s and compete for the better existence, but on steroids.

While there are numerous products that simultaneously trigger feelings of self-confidence and self-loathing, there is no parallel to Instagram. Fashion and beauty magazines aimed at teen girls have historically sold triumphant narratives to their readers while also peddling self-improvement through consumerism. Yet a reader cannot find her friends chatting in real-time, in ways that could include or exclude her, in those same pages. Hollywood television series and movies, which often depict unattainable looks and lifestyles for teen girls, end after a set running time. Viewers don’t wait for a glamorous celebrity to speak directly to the crowd, then chime in with their own comment and wait eagerly for someone to notice.

Instagram likes to think of these dynamics as simply a reflection of our shared reality.

“Issues like negative social comparison and anxiety exist in the world, so they’re going to exist on social media too,” Karina Newton, Instagram’s head of public policy, said in the company’s statement.

Yet, Instagram has arguably changed real life itself by ratcheting up the stakes of teen girls’ digital social lives and interactions. The Journal interviewed teens who said, among other things, that Instagram intensified the feeling that high school is a popularity contest, and drew them to content that heightened negative emotions about their body.

One 19-year-old said that when she searched Instagram for workouts and found examples she liked, the algorithm kept surfacing photos of how to lose weight on her Explore page.

“I’m pounded with it every time I go on Instagram,” she told the Journal.

While every family can do its best to learn about digital safety and well-being, the truth is that those efforts are hardly a match against a company that has designed an addictive, ever-present product capable of making users feel both good and bad. The users, meanwhile, never know which experience they’ll get on any given day, or hour.

Still, teens return day after day for reasons that Facebook and Instagram cite as a defense of their product. They want to socialize with their friends. They’re participating in activism and social change. They found a community that accepts them for who they are. There may be lots of benefits and no harm in these scenarios, but Facebook and Instagram haven’t been particularly interested in letting users know when the platform causes pain. In fact, it seems content to withhold its own internal findings while emphasizing the uncertainty of independent scientific research that fails to establish a causal relationship between social media use and poor well-being. (Facebook founder Mark Zuckerberg reportedly called such research contradictory.)

The evidence presented by the Journal suggests that Facebook can and will conceal its teen users’ negative experiences if they threaten the company’s bottom line. Instead, Instagram has partnered with nonprofits to create content promoting “emotional resilience.” According to the Journal, one video made as part of that project recommended teens use a daily affirmation — “I am in control of my experience on Instagram” — for a more positive experience.

The Journal’s reporting, however, makes it obvious that users aren’t really in control. Through Instagram, Facebook has provided a platform for advertisers and influencers to leverage an algorithm to take advantage of girls’ insecurities in ways that simply weren’t possible in the past. Everyone is in it for the money — except for the girls.

If you want to talk to someone or are experiencing suicidal thoughts, Crisis Text Line provides free, confidential support 24/7. Text CRISIS to 741741 to be connected to a crisis counselor. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 8:00 p.m. ET, or email [email protected]. You can also call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.

Reblogged 1 day ago from feeds.mashable.com

Instagram boss says social media is like cars: People are going to die

Adam Mosseri, seen here probably not talking about lives cut short by social media.

Adam Mosseri isn’t doing Facebook any favors.

The head of Instagram was interviewed on the Recode Media podcast this week following a damning series of articles in the Wall Street Journal based on leaked internal Facebook documents. In the interview with host Peter Kafka, Mosseri attempted to defend the negative effects his platform has on its users by comparing social media to cars. The gist of his argument? Some people are just going to get run over, and that’s the price we all pay.

“We know that more people die than would otherwise because of car accidents, but by and large cars create way more value in the world than they destroy,” argued Mosseri. “And I think social media is similar.”

The Journal story in question explains how internal Facebook research (Facebook owns Instagram) found Instagram was making life worse for a segment of its users.

“We make body image issues worse for one in three teen girls,” read one 2019 internal slide obtained by the paper. “Teens blame Instagram for increases in the rate of anxiety and depression,” read another.

A fashion car wreck. Credit: Arturo Holmes / Getty

In response to Mosseri’s car comments, Kafka rightly pointed out that automobiles are subject to intense safety regulation on a federal level, which Mosseri countered by pivoting between saying social media regulation is welcome and, well, that it’s also potentially problematic.

“We think you have to be careful,” he said, “because regulation can cause more problems.”

Kafka was not the only one to see and highlight the inconsistency in Mosseri’s defense. Many on Twitter were quick to point out that Mosseri had come up empty when grasping at straws.

Mosseri’s analogy involving fatal car crashes may have been a little too on the nose. The Facebook research reported by the Journal found that, as the paper put it, “Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram, one presentation showed.”

While Mosseri’s framing of social media as an ultimate societal benefit that just so happens to have some rather nasty negative externalities may come as a shock to some listeners of the Recode Media podcast, it follows in a long line of outlandish self-justification done by Facebook executives.

In 2018, BuzzFeed News published a memo written by then Facebook VP Andrew Bosworth (Bosworth has since managed to fail up to the head of Facebook’s Reality Labs, the division behind the privacy disaster in waiting that is Facebook’s camera glasses). The 2016 document painted a damning picture of a company dead set on ignoring the real-world consequences of its services.

The memo argued that Facebook’s purpose was to connect people, and sure people might die as a result, and that would be bad, but that wouldn’t slow the company down.


“Maybe it costs a life by exposing someone to bullies,” wrote Bosworth. “Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people.”

It sounds like Mosseri and Bosworth have a lot to talk about. Too bad for the rest of us that the men’s collected influence on the lives of billions means we’ll all be forced to listen.

Reblogged 1 day ago from feeds.mashable.com

How to save videos on Snapchat

Saving a video on Snapchat is easy.

If you’ve taken a video on Snapchat and you want to save it to your camera roll, don’t fear, it’s a super easy process.

Saving a video on Snapchat is very straightforward. You can choose whether to save it to “Memories” (Snapchat’s own camera roll, and the place it pulls those pesky daily flashbacks from) or to both Snapchat’s “Memories” and your phone’s camera roll.

If you’re looking to save a video sent to you on Snapchat, it’s a little more complicated…and a little creepier. Maybe just don’t? Either way, we laid out a few options for you below.

Follow these steps to save videos on Snapchat.

How to save videos on Snapchat

1. Open Snapchat

2. Take video

3. Tap the downward pointing arrow on the bottom of the screen


4. Choose if you want to save the video just on Snapchat through “Memories” or if you want to save it to your camera roll too. Tap the option you want.


5. Select “Save”

Select "Save."

Select “Save.”
Credit: Screenshot: snapchat

When the video has saved, the downward pointing arrow will become a check mark.


How to save a video on Snapchat that was sent to you

Yes, there are ways to record a Snapchat video sent to you, but we recommend that you respect that Snapchats are fleeting and let the video go. That being said, to save a video sent to you on Snapchat you can screen record it, but the user who sent you the Snap will receive a notification that you screen recorded the video.

Alternatively, you can use a sketchy third party app or mirror your phone to your computer screen and record it there. We don’t recommend either. Additionally, Snap is constantly updating and it’s very possible these methods won’t work, or the other person will be notified.

There is also the tried and true method of using another phone to record the video, but maybe just don’t!

Reblogged 1 day ago from feeds.mashable.com

The Future SEO: Boardroom edition

30-second summary:

  • SEO’s dynamic nature and Google’s mysterious algorithm specifics keep the industry on its toes
  • Is it possible to simply spot the inefficiencies of SEO in its infancy and foresee trends?
  • With over 20 years of leadership roles, SEO pioneer Kris Jones taps into his experience to help SEOs derive more strategic value

Pretty much anytime we speak about something’s future, we’re doing something called extrapolating. By definition, extrapolating involves extending existing data or trends to assume the same procedure will continue in the future. It’s a form of the scientific method that we probably use every day in our own lives, quite reasonably, too: the summers will be hot, the downtown traffic will be bad at 9 AM, and the sun will rise tomorrow morning.

But how can we look into the future of something as complex and ever-changing as SEO? As with all cases of hindsight, we are clear on how SEO began and how it has transformed over time.

We see the inefficiencies of SEO in its infancy and how advancing search engines have altered the playing field.

The catch is this: how can we surmise about the future of SEO without having access to all the mysterious algorithm specifics that Google itself holds?

The answer is simple: we have to extrapolate.

I’ve seen SEO from the boardroom perspective for more than 20 years. I’ve seen everything from the old days of keyword stuffing to the semi-modernization of the late 2000s to the absolute beast that Google has become now, in the 2020s.

Given that, where do I think SEO is going in the not-too-distant future? Here are some thoughts on that.

User intent will remain crucial

One aspect of SEO that is essential right now and will become only more vital as time goes on is user intent in search queries.

It’s an antiquated view to think that Google still cares much about exact-match keywords. Maybe 15 to 20 years ago, getting keywords exactly right in your content was a huge deal. Google matched queries to corresponding word strings in content and then served the best of that content to a user.

Today, trying to optimize for exact-match keywords is a futile effort, as Google now understands the intent behind every query, and it’s only going to get better at it as time goes by.

If you recall Google’s BERT update from late 2019, you’ll remember that this was the change that allowed Google to comprehend the context of each search query, or the meaning behind the words themselves. And the latest Multitask Unified Model (MUM) update adds further depth and dimensions to understanding search intent.

No longer does Google look only at the words “family attractions.” It knows that that query references children’s activities, fun activities, and events that are generally lighthearted and innocent.

And all of that came from two words. How did Google do it? Its consistent algorithm updates have allowed it to think like a human.

All of this is to say that user intent has to be part of your keyword and content strategy going forward when you’re doing SEO.

Produce more evergreen content

Sometimes, over the years, I have heard people mention that devising an effective content marketing strategy is difficult because as soon as a topic’s period of relevance is over, that content will never rank again. Use your data to analyze content performance and strike the right balance between content and formats. 

If you don’t know much more about this subject than that, you might be tempted to believe it. Maybe, at one time, you got a content piece entitled “Top Furniture Brands of 2019” to rank for the featured snippet. That makes sense. The post was probably a long listicle that described the best brands and linked out to the manufacturers’ websites or retail stores that carried those brands.

But maybe, as spring of 2019 transitioned into fall and winter, that post fell way down the rankings and now can’t be found anywhere anymore.

The reason is obvious: you haven’t made the content evergreen. The best furniture brands of 2019 may not be the best brands of 2020 or 2021 or 2022. So, what do you do? You put the work in to make the blog post evergreen by updating it. Go through and change out the best brands, change the content, change the post’s title, and then republish the post.

You can also just plain focus on subjects that will almost never need any updating at all:

  • “Top 20 Christmas cookies to bake this year”
  • “How to train a dog”
  • “10 Steps for Hanging Heavy Objects on the Wall”

Whether it’s 2021 or 2050 or 2100, there are going to be people who have never hung a thing on a wall before and will need some help online.

Whatever your market niche is, do some topic research in Answer the Public, Semrush, or BuzzSumo to find relevant subjects for you. You can also mine the SERPs to see what kinds of content are ranking already for your desired topics. Just remember to mix in plenty of evergreen content with your more timely content posts. Google will reward you for it.

Mobile will remain first

This final point is about mobile-first indexing, but you likely already know about that. It’s certainly no secret that Google is going to rank your website’s mobile version when it crawls your pages. About 60 percent of all searches are now performed on mobile devices, and so Google now prioritizes a site’s mobile web pages over the desktop versions.

As I said, you knew all that.

What some people still may not know is that Google’s new Core Web Vitals should be a major part of your mobile page optimizations.

The Core Web Vitals are primarily a web-dev task. Overall, the three vitals work together to give users positive, seamless experiences when they access a web page.

The vitals are Cumulative Layout Shift (CLS), Largest Contentful Paint (LCP), and First Input Delay (FID).

CLS measures how much a page’s visible content shifts around while it loads.

A high CLS is bad: elements keep moving after they first appear, which increases the chance that a user taps on something just as it jumps elsewhere and ends up clicking something unintended.

LCP, meanwhile, measures how quickly a page’s main content appears. It specifically refers to the amount of time between when you request a URL and when the largest content element on that page has rendered.

Finally, FID measures the delay between a user’s first interaction with a page (typing in a field or clicking a menu item, for example) and the moment the browser can begin responding to that interaction.

Even if you don’t work in web development, you can see how useful these three measures actually are. They all take user experience into account, which is exactly why they are part of Google’s larger 2021 Page Experience update.
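If you want a rough sense of where your own pages stand on these metrics, the browser exposes the underlying signals through the PerformanceObserver API. The sketch below is illustrative only; it is not how Google measures the vitals in the field, and a dedicated library such as web-vitals handles edge cases (back/forward cache restores, tab visibility, CLS session windows) that this snippet ignores.

```typescript
// Minimal, illustrative sketch of observing the raw signals behind the
// three Core Web Vitals in the browser. Production setups should use a
// dedicated library (e.g. web-vitals) instead.

// Largest Contentful Paint: the latest "largest-contentful-paint" entry
// seen before the user interacts is the current LCP candidate.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const latest = entries[entries.length - 1];
  console.log(`LCP candidate: ${Math.round(latest.startTime)} ms`);
}).observe({ type: "largest-contentful-paint", buffered: true });

// Cumulative Layout Shift: sum the scores of layout shifts that were not
// caused by recent user input (a simplified version of the real metric).
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const shift = entry as unknown as { value: number; hadRecentInput: boolean };
    if (!shift.hadRecentInput) cls += shift.value;
  }
  console.log(`CLS so far: ${cls.toFixed(3)}`);
}).observe({ type: "layout-shift", buffered: true });

// First Input Delay: the gap between the user's first interaction and the
// moment the browser was able to start processing it.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const input = entry as PerformanceEventTiming;
    console.log(`FID: ${Math.round(input.processingStart - input.startTime)} ms`);
  }
}).observe({ type: "first-input", buffered: true });
```

In practice, PageSpeed Insights and Search Console report the same three numbers from real-user data, which is usually the easier place to start.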

The Core Web Vitals are essential in and of themselves, but I think my “boardroom” perspective on them is one we can all safely adopt: that they are just examples of more great things to come from Google.

The search engine giant is always thinking of new ways to make users have better, more helpful, and more positive experiences on its platform. As SEOs, we need to be ready to respond so we don’t get left in the dust.

To know the future, look to the past

We know that extrapolation can be taken only so far, but that’s why the past is so vital to understand. It can give us hints at what lies ahead.

What will Google think of next? It’s going to respond to whatever need is out there for improved online search experiences.

Think of 2020, when the pandemic was in its infancy. People needed information, and Google responded. Within months, you could tell whether restaurants were requiring masks indoors, how many virus cases were in your county, and where you could go for more information or help.

What, then, is the future of SEO? It’s going to be whatever the masses need it to become.

Kris Jones is the founder and former CEO of digital marketing and affiliate network Pepperjam, which he sold to eBay Enterprises in 2009. Most recently, Kris founded SEO services and software company LSEO.com and has previously invested in numerous successful technology companies. Kris is an experienced public speaker and is the author of one of the best-selling SEO books of all time, ‘Search-Engine Optimization – Your Visual Blueprint to Effective Internet Marketing’, which has sold nearly 100,000 copies.


Reblogged 1 day ago from www.searchenginewatch.com

Web Quality Assurance: From User Requirements To Web Risk Management

As a chemist by trade, I received a Master’s degree in Quality Management and Quality Control from Bordeaux University. My initial career was in the wine industry, ensuring the quality of the laboratory’s operations and the analyses that came out of it. As a side note, the last question of my job interview as quality assurance manager of the laboratory was “Do you like wine?” I said I didn’t. They said, “You’re hired.”

In 1999, I decided to apply my quality management insights to the web. I quit my job in the wine laboratory. I immediately started work on answering the question, “What does quality mean for a web user?” That also means answering this other question: “How can one evaluate, manage and guarantee the quality of a website?”

Quality assurance (QA) is defined as:

“A program for the systematic monitoring and evaluation of the various aspects of a project, service, or facility to ensure that standards of quality are being met.”

— “Quality assurance,” Merriam-Webster

QA is a central part of any quality management approach and all quality management is very closely linked to risk management. In most sectors where risks are understood and perceived as critical, quality assurance inevitably develops. This is why quality assurance is a pillar of the aeronautics, automotive, health, and even video game industries, and not many would dream of questioning it.

The search for the answers to questions related to quality assurance led me to create my company and to produce a few documented checklists and standards about open data, performance, and web accessibility, including the first two versions of the French national standard on accessibility (“RGAA”, which stands for Référentiel Général d’Amélioration de l’Accessibilité). It also led me to write a book about web quality assurance and the forewords of eight more books about UX, ecodesign, CSS, and frontend development, amongst other topics. Answering these questions is also the reason why I am still passionate about web quality assurance years later. And it is these same questions that led me to you and your web projects. By the way, now I like wine, from everywhere.

What Does Quality Mean For The Users?

When delving into the concept of website quality assurance back in 2001, we started with a simple question: “What does quality mean for the users?”

According to the ISO (International Organization for Standardization), the term quality is:

“…the degree to which a set of inherent characteristics of an object fulfills requirements.”

Asking this initial question about a website involves analyzing the user requirements. During our research, we created a model which is comprised of five fundamental user requirements:

  1. Visibility is the ability of a site to be encountered by its potential users.
  2. Perception represents its ability to be usable and positively perceived by its users.
  3. Technical concerns its ability to function correctly.
  4. Content covers the ability to deliver quality information.
  5. Services determine its ability to offer, accompany, and/or generate quality services.

Of course, there are other requirements that matter to users. These five don’t cover emotions (pleasure, attachment, gratitude, and so on), only the success of fundamental needs, and the model doesn’t aim to identify all user requirements exhaustively. However, it can be used to classify and order them. We called it the VPTCS model.

It tells us that no matter what the product or service is, or who the users are:

Users need to be able to find the website. They need to be able to use and perceive it correctly, they need the website to function correctly, they need high-quality content, and they also need to have a good experience after their visit.

How Does Web Quality Assurance Relate To UX And UI?

To work on web quality assurance, we also had to work on another part of the quality definition: the inherent characteristics of an object. That means describing what a website is. That led us to work on the structure of the UX (user experience) and how it’s related to the UI (user interface). To do that, we also used the VPTCS model (Visibility, Perception, Technical, Content, Services).

The model reads in chronological order relating to the user’s visit to the site and the three major phases: before, during and after.

  • V: before the visit
  • PTC: during
  • S: after the visit

As you’ll see below, we’ve decided to use the VPTCS model to distinguish the total user experience (UX) from the user interface (UI). UI is covered by the three central sections of the model: Perception, Technical and Content, and is only one part of the journey.

UX starts before, and ends after, UI.

Visibility leads us to take an interest in why and how the user arrived. Visibility starts before the user encounters the interface. For instance, the way the website is described in search engine result pages, or the way people talk about the website on social media, is all part of the user experience.

On the other end of the model, the Services section leads us to have a look at what happens after the user has left the interface. For example, on an e-commerce site, your experience doesn’t end the moment you leave the site; it continues. For instance, when you can’t reach customer support or have to wait 20 minutes to talk to a live person, when your package is delivered damaged or partially opened, or when you realize that the product description on the website wasn’t accurate. In these instances, you’re no longer using the interface itself, but interacting in a real-world user experience.

Though the VPTCS model provided us a point of view on what a website is and what the requirements of the user are, we also wanted to determine the consequences for the stakeholders of the web project i.e. those designing, producing, developing, commercializing, or marketing the website.

Which Trades Are Involved In Web Quality Assurance?

“In order to achieve high-quality user experience in a company’s offerings there must be a seamless merging of the services of multiple disciplines, including engineering, marketing, graphical and industrial design, and interface design.”

— “The Definition Of User Experience (UX),” Don Norman and Jakob Nielsen

When we started working on web quality assurance, we found that identifying user requirements (Visibility — Perception — Technical — Content — Services) wasn’t enough. To gain professional buy-in to the quality assurance approach we had to identify the different disciplines that are involved in a web project and relate them to the requirements. In mapping the different trades, you can see each and every trade is necessary and they all have at least one point in common: the user.

At this point, we had a set of tools to understand a set of requirements on the user side, and the way web trades were related to these user requirements.

Working on the concept of quality is always a multidisciplinary approach. Each user has their own subjective view about the quality of a product. Some of the users are more sensitive to technical problems, others are more preoccupied with the quality of the content, some are deeply impacted by the quality of the services. Evaluating the quality can’t be fully objective, but it’s always possible to convert general user requirements into more actionable tools. To do so, one of the simplest tools one can create is a checklist, and we did just that.

Converting User Requirements Into An Actionable Checklist

“We need a different strategy for overcoming failure, one that builds on experience and takes advantage of the knowledge people have but somehow also makes up for our inevitable human inadequacies. And there is such a strategy — though it will seem almost ridiculous in its simplicity, maybe even crazy to those of us who haven’t spent years carefully developing ever more advanced skills and technologies.

It’s a checklist.”

— Atul Gawande, The Checklist Manifesto

We decided to translate the VPTCS model into individual rules by applying the following checks:

“Is the rule universal, realistic, sustainable, and directly verifiable by end users? Does it have consensus and add value for users?”

In 2004, we submitted a set of rules to the Opquast community of web professionals in public online workshops. We gave them the following criteria for submitting rules: each rule must have a described impact on users, must be realistic, must have consensus, must be universal and verifiable by the end-user. This set of “rules to create rules” is a sanity test to keep only the rules that can be accepted and used by the community.

Since then, we have produced 4 versions of our checklist — in 2004, 2010, 2015, and 2020. In total, we have collected more than 10,000 comments, discarded more than a thousand quality rules, and kept only 240 which passed the sanity test. This checklist isn’t intended to replace other privacy, security, accessibility, SEO, or ecodesign checklists or standards. It’s only meant to list the main checkable, realistic, useful and universal, non-numeric rules that apply to a web project.

The key was and remains acceptance by all web professions, and that’s also the reason why we decided to work under an open license [1]. We created cards that list the objectives (added value for users), how to implement the rule (implementability), and how to verify it (verifiability). As a side note, the rules can’t contain numeric figures. We learned this the hard way: after releasing the first version, one of the rules stated that the homepage, images included, couldn’t exceed 150 KB. It looked realistic in 2004, but the rule was already irrelevant by 2005. We need the rules to stay relevant for at least five years, and setting numerical limits seriously hurts the consensus we aim to reach. So we added this constraint to our sanity check.

[1] The rules are under an open license CC-BY-SA (Creative Commons Attribution–ShareAlike License).

The 240 rules have an impact on each and every role in a web team, from developers to customer support, from management to operations, and from UX designers to content producers. For instance, we have 35 rules out of 240 that relate to ecodesign, 23 to security, 37 to SEO, 126 to accessibility, 38 to e-commerce.

The most logical and obvious approach is to use checklists (this one or others) as conception or pre-launch tools. In our case, that means this full checklist can be used for audits, with the help of the control section of each rule. It also can be used during the conception and design process using excerpts of the checklist.

However, we found the audit or pre-launch was probably not the first thing needed to efficiently start a web quality assurance process. Before you try to comply with the rules, you need to ensure that the whole team involved in a web project understands them, even when the rules don’t seem directly related to their role on the project.

The latest version of the checklist (240 cards — 2020) in French, English and Spanish: https://checklists.opquast.com/en/web-quality-assurance/.

How To Use Web Quality Assurance Checklists?

At a first glance, the most important thing to understand in a rule is the rule itself. But maybe the reason why a rule exists is more interesting and insightful. Let’s look at an example with our Rule n°233: “The text of internal PDF documents can be selected.”

Let’s list the user contexts where compliance with this rule can be useful:

  • The content of the PDF file can be vocalized with a screen reader;
  • The content of the PDF file can be indexed on search engines;
  • The content of the PDF file can be searched;
  • The content of the PDF file can be translated;
  • The content of the PDF file can be copy-pasted.

These user cases can concern five different users:

  1. A blind user using a screen reader;
  2. A user who searches for content on a search engine;
  3. A user who searches a determined piece of content in the document;
  4. A user who doesn’t speak the language of the document and needs a translation;
  5. A user who wants to reuse a part of the content of the document.

Alternatively, it can concern a single user who experiences all five cases above. For example, let’s imagine a blind Bulgarian scientist searching for where they are cited on the web: they find a PDF in English, search for their name in the PDF file, translate it automatically into Bulgarian, and finish by copying and pasting part of the content into their portfolio.

That means that with only one rule out of 240, one can identify five contexts where the rule will be useful for the users. It means that it’s a way to trigger empathy for the users who are on the other side of the screen in the diversity of their contexts.
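As a practical side note, a rule like this also lends itself to a rough automated check: try to extract a text layer from the PDF and flag documents that yield none, which usually means they are scanned images. The sketch below assumes the pdfjs-dist package is available (the exact import path varies between versions) and that a simple “no text at all” heuristic is good enough for a first pass; it is not part of the Opquast tooling itself.

```typescript
// Rough, illustrative check for rule n°233 ("The text of internal PDF
// documents can be selected"). Assumes the pdfjs-dist package is installed;
// a PDF with no extractable text is most likely an image-only scan.
import { readFile } from "node:fs/promises";
import { getDocument } from "pdfjs-dist";

async function hasSelectableText(path: string): Promise<boolean> {
  const data = new Uint8Array(await readFile(path));
  const pdf = await getDocument({ data }).promise;

  for (let pageNumber = 1; pageNumber <= pdf.numPages; pageNumber++) {
    const page = await pdf.getPage(pageNumber);
    const content = await page.getTextContent();
    // Each item is a positioned text run; any non-empty string means there
    // is a real text layer that can be selected, searched, and vocalized.
    const hasText = content.items.some(
      (item) => "str" in item && (item as { str: string }).str.trim() !== ""
    );
    if (hasText) return true;
  }
  return false;
}

// Usage sketch:
// hasSelectableText("./report.pdf").then((ok) =>
//   console.log(ok ? "Rule 233: looks fine" : "Rule 233: likely fails (image-only PDF)")
// );
```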

Therefore, the first thing for a professional considering a quality rule is not how to apply the rule itself, but to understand what it is, who it is for, and why it exists. All the rules have a benefit for the users, but the reality of the web project is that professionals don’t have unlimited means. They, therefore, need to make decisions and finally, the professionals need to be able to make informed decisions on whether to apply a rule or not.

As a web professional, and despite any limited means of the web projects in which you participate, you must be able to objectively evaluate the quality of a site, to argue and explain the basis of that evaluation, to identify the risks, and arbitrate in full knowledge of the known facts.

Quality assurance needs to become a primary reflex for integrated organization-wide teams — web designers, management, sales, developers, marketing, after-sales, delivery drivers — all the people involved in the user experience.

At this point of our reflection, we have an initial set of tools to deploy web quality assurance, but that doesn’t mean we have everything that’s necessary to integrate web quality assurance and web quality management into our processes.

To go further, we need to look at the major risks of the web project.

Where Are The Major Risks Of A Web Project?

“A risk assessment is the combined effort of identifying and analyzing potential (future) events that may negatively impact individuals, assets, and/or the environment (i.e. hazard analysis); and making judgments “on the tolerability of the risk on the basis of a risk analysis” while considering influencing factors (i.e. risk evaluation).”

— “Risk assessment,” Wikipedia

Our entire industry has learned the hard way that, yes, web activities do carry abundant risks. As in other industries like aeronautics, automotive, or health, each risk must be classified according to whether or not it is critical (hazard analysis).

Appreciating a risk as critical is always partly subjective. Therefore, in the case of the web industry, I found four subjects where the risks are particularly critical. Three of these subjects (accessibility, security, and privacy) have potential major consequences for the users. These consequences can also negatively affect your brand image and business. They can lead to insurmountable issues for the users, revenue losses, and litigation.

The last subject I chose (ecodesign) is also critical from a systemic point of view with major potential consequences on our personal and professional lives.

There are plenty of issues that may really harm your business (poor performance, bad UX design, insufficient SEO, and so on), but in general, they won’t do as much harm as the four I identify below. The four listed subjects are by far the most critical for you, the companies and clients you work with, and, first and foremost, the users.

These four subjects and their associated risks are present in all web projects. Let’s have a look at them:

  1. Accessibility
    Is my site accessible to people with disabilities? Am I discriminating against certain people? If so, what are the risks?
    In a report published by Accessibility.com, it was estimated that 265,000 website accessibility demand letters were sent to businesses last year, resulting in U.S. companies spending perhaps billions of dollars in legal costs as a direct result of inaccessible websites in 2020 alone (Source: BOIA).
  2. Security
    Is my project endangering my organization, my colleagues, or the users? If so, what are the risks?
    In 2020, according to govtech.com, there was a 141% increase in compromised records from data breaches compared to 2019, by far the most records exposed in a single year since the site began reporting on data breach activity. They also reported that the average cost of a data breach was $3.86 million as of 2020. (Source: IBM).
  3. Privacy
    Am I putting my company’s data, my users or my employee data at risk? What are the potential consequences?
    The General Data Protection Regulation (GDPR) came into effect in May 2018. The GDPR allows the EU’s Data Protection Authorities to issue fines of up to €20 million ($24.1 million) or 4% of annual global turnover (whichever is higher). […] Penalties under the GDPR totaled €158.5 million ($191.5 million). (Source: Tessian).
  4. Ecodesign
    What is the environmental impact of my project? To what extent does my project contribute to climate change?
    The non-profit organisation The Shift Project looked at nearly 170 international studies on the environmental impact of digital technologies. According to the experts, their share of global CO2 emissions increased from 2.5 to 3.7 percent between 2013 and 2018 […] The Borderstep Institute compares various studies and comes to the conclusion that the greenhouse gas emissions caused by the production, operation and disposal of digital end devices and infrastructures are between 1.8 and 3.2 percent of global emissions (as of 2020). (Source: RESET).

We cannot afford to ignore any of the risks mentioned. Over the last ten years, these risks and their consequences have increased, causing spiraling costs, failed redesigns, lawsuits, cyber attacks, staff burnout, high turnover, environmental impact, and more. As we can see in the previous examples, all these have financial, human, social and environmental costs, all of which need to be avoided in our industry.

What we are seeing now with the web is just a very classic maturation phase of a young industry where, gradually, standards, methods, and frameworks unfold as the customers demand higher quality and providers set quality goals in order to achieve it. Disparate risks and domains like accessibility, ecodesign, performance, security, and privacy are becoming more and more structured, standardized, and subjected to national laws and regulations.

Let’s have a look at what emerged in established industries confronted with similar quality management equations to solve.

Toward Cross-Disciplinary Web Quality Management

At the turn of the ’80s, quality management professionals were mostly working on quality issues, using mainly the ISO9000 standard. Quality control, quality assurance, and quality management were the only matters I was taught around 1990. However, there were other people working on a different set of risks with standards: ISO14000 was the reference for the environment and ISO 27000 for IT security.

Compliance and deployment of these management standards were driven by distinct departments of industrial companies. At some point, because all standards were linked and probably because there were a lot of tasks and tools to mutualize, companies created HQSE (Health, Quality, Security, Environment) services. This kind of approach is called “integrated management systems”:

“Once upon a time, there was an H&S (health and safety) manager whose role expanded to an HSE (health, safety and environment) manager. At that same time, there was a quality manager whose roles were completely separate from the HSE supervisor. But as technology got more and more integrated into the workflow and as demand for speedy quality service and products increased, the roles have merged into one QHSE manager.”

— “Let’s Build,” Houdayfa Cherkaoui

There’s something really important to know about quality management or integrated management systems, and that is that they don’t “produce” quality, they don’t provide environmental or security compliance. They simply help the rest of the organization to control and improve these subjects. None of the people in these departments replace the experts, they just provide them with the tools, the standards, machine automation, and so on. They help everybody stay up to date and interface with the clients when a company has to prove it’ll be able to deliver a certain level of quality.

Now it’s time for me to take a bet on the future. As in already established industries like aeronautics, automotive, and medicine, quality assurance has been introduced as a direct consequence of risk perception. Web teams already manage these risks separately, but since users are affected by all of them, we will need a cross-disciplinary approach that brings together all the subjects we have to deal with when building or maintaining a web project.

It’s too early to say exactly what will happen in detail, but what I envisage is the integration of a new layer of web QA which will assemble, sustain, and bring closer together the different web trades and realms.

What To Take Away From This Article

Along my journey (which I hope isn’t finished 🙂), I have learned quite a few things.

In order for us to deploy web quality assurance, we need to understand the user requirements and the consequences for the web trades that are involved in a web project. Coming back to the VPTCS model, one of the most important things we’ve observed is that the Visibility and Services parts are frequently underestimated by web teams — especially by website owners.

We have also observed that the two requirements that bring the highest value to the users are Content and Services. However, the web professionals working in roles that fall under the Visibility, Perception, and Technical categories are frequently perceived as the most important in a web project. Yet they can’t work without high-quality content and services (support, logistics, delivery, and so on) that are seamlessly integrated within the web project team.

Another thing we learned is that UI is frequently perceived as a purely visual and ergonomic job. Showing that the UI is a mix of perception, technical, and content helps create fewer misunderstandings between different teams working remotely. That points to the need for unified teams in which all trades are involved and work together.

There are many forms of quality assurance already in the web industry standards: regulations, unit testing, functional testing, automatic tools, manual audits, checklists, and so on. The web is seeing quality assurance increase gradually, but as far as I’m concerned, we are only at the very beginning of the road. As a start, checklists are very simple tools that can be used to reach compliance, but also to share a common culture and vocabulary.

Web teams can use checklists for compliance, but in my own experience, if you want to improve compliance and if you want web quality assurance to be sustainably deployed in your organization, it’s more efficient to first create a web quality culture with a common vocabulary and foundational framework to bootstrap from.

“63% of people who are in a digital transformation process say that culture is the number one barrier… 56% cited cross-department collaboration as their third-largest challenge.”

Altimeter & Capgemini study

The goal is to create a cultural foundation of shared risks — like the ones I mentioned and all the others — and responsibilities oriented toward the users. A global set of rules is one of the solutions you can use to empower web teams and to create a global culture and vocabulary, but it also needs to be accompanied by a global, mutualized risk management system for web projects. This management system needs to maintain a global set of rules, standards, and tools, and be able to call on specialized experts for complex problems.

Web quality assurance can help create more responsible professionals, trained and empowered as quality custodians who represent the best interests of users, customers, and citizens.

The journey goes on.


Reblogged 1 day ago from smashingmagazine.com

More Google Ads campaign changes this month; Thursday’s daily brief

Plus, Reddit makes good on its advertising features promise

Please visit Search Engine Land for the full article.

Reblogged 1 day ago from feeds.searchengineland.com

How to test your content site strategy for continued improvement

Forecasting, documenting and analyzing your content initiatives can help grow your brand’s visibility and your skills as an SEO.

Please visit Search Engine Land for the full article.

Reblogged 1 day ago from feeds.searchengineland.com

How Our Website Conversion Strategy Increased Business Inquiries by 37%

Having a website that doesn’t convert is a little like having a bucket with a hole in it. Do you keep filling it up while the water’s pouring out — or do you fix the hole then add water? In other words, do you channel your budget into attracting people who are “pouring” through without taking action, or do you fine-tune your website so it’s appealing enough for them to stick around?

Our recommendation? Optimize the conversion rate of your website, before you spend on increasing your traffic to it.

Here’s a web design statistic to bear in mind: you have 50 milliseconds to make a good first impression. If your site’s too slow, or unattractive, or the wording isn’t clear, visitors will bounce faster than you can say “leaky bucket”. Which is a shame, because you’ve put lots of effort into designing a beautiful product page and About Us, and people just aren’t getting to see them.

As a digital web design and conversion agency in Melbourne, Australia, we’ve been helping our customers optimize their websites for over 10 years, but it wasn’t until mid-2019 that we decided to turn the tables and take a look at our own site.

As it turned out, we had a bit of a leaky bucket situation of our own: while our traffic was good and conversions were okay, there was definitely room for improvement.

In this article, I’m going to talk a little more about conversions: what they are, why they matter, and how they help your business. I’ll then share how I made lots of little tweaks that cumulatively led to my business attracting a higher tier of customers, more inquiries, plus over $780,000 worth of new sales opportunities within the first 26 weeks of making some of those changes. Let’s get into it!

What is conversion?

Your conversion rate is a figure that represents the percentage of visitors who come to your site and take the desired action, e.g. subscribing to your newsletter, booking a demo, purchasing a product, and so on.
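As a quick worked example (with made-up numbers): if 4,000 people visit your site in a month and 100 of them submit an inquiry, your conversion rate for that goal is 100 ÷ 4,000 = 2.5%. In code, it’s nothing more than:

```typescript
// Conversion rate for a single goal. The figures are illustrative only.
const visitors = 4000;
const inquiries = 100;
const conversionRate = (inquiries / visitors) * 100;
console.log(`Inquiry conversion rate: ${conversionRate.toFixed(1)}%`); // 2.5%
```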

Conversions come in all shapes and sizes, depending on what your website does. If you sell a product, making a sale would be your primary goal (aka a macro-conversion). If you run, say, a tour company or media outlet, then subscribing or booking a consultation might be your primary goal.

If your visitor isn’t quite ready to make a purchase or book a consultation, they might take an intermediary step — like signing up to your free newsletter, or following you on social media. This is what’s known as a micro-conversion: a little step that leads towards (hopefully) a bigger one.

A quick recap

A conversion can apply to any number of actions — from making a purchase, to following on social media.

Macro-conversions are those we usually associate with sales: a phone call, an email, or a trip to the checkout. These happen when the customer has done their research and is ready to leap in with a purchase. If you picture the classic conversion funnel, they’re already at the bottom.

Micro-conversions, on the other hand, are small steps that lead toward a sale. They’re not the ultimate win, but they’re a step in the right direction.

Most sites and apps have multiple conversion goals, each with its own conversion rate.

Micro-conversions vs. macro-conversions: which is better?

The short answer? Both. Ideally, you want micro- and macro-conversions to be happening all the time so you have a continual flow of customers working their way through your sales funnel. If you have neither, then your website is behaving like a leaky bucket.

Here are two common issues that seem like good things, but ultimately lead to problems:

  1. High web traffic (good thing) but no micro- or macro-conversions (bad thing — leaky bucket alert)

  2. High web traffic (good thing), plenty of micro-conversions (good thing), but no macro-conversions (bad thing)

A lot of businesses spend heaps of money making sure their employees work efficiently, but less of the budget goes into what is actually one of your best marketing tools: your website.

Spending money on marketing will always be a good thing. Getting customers to your site means more eyes on your business — but when your website doesn’t convert visitors into sales, that’s when you’re wasting your marketing dollars. When it comes to conversion rate statistics, one of the biggest eye-openers I read was this: the average user’s attention span has dropped from 12 to a mere 7 seconds. That’s how long you’ve got to impress before they bail — so you’d better make sure your website is fast, clear, and attractive.

Our problem

Our phone wasn’t ringing as much as we’d have liked, despite spending plenty of dollars on SEO and Adwords. We looked into our analytics and realized traffic wasn’t an issue: a decent number of people were visiting our site, but too few were taking action — i.e. inquiring. Here’s where some of our issues lay:

  • Our site wasn’t as fast as it could have been (anything with a load time of two seconds or over is considered slow; ours was hovering around 5-6 seconds, and that was having a negative impact on conversions).

  • Our CTA conversions were low (people weren’t clicking — or they were dropping off because the CTA wasn’t where it needed to be).

  • We were relying on guesswork for some of our design decisions — which meant we had no way of measuring what worked, and what didn’t.

  • In general, things were good but not great. Or in other words, there was room for improvement.

What we did to fix it

Improving your site’s conversions isn’t a one-size-fits-all thing — which means what works for one person might not work for you. It’s a gradual journey of trying different things out and building up successes over time. We knew this having worked on hundreds of client websites over the years, so we went into our own redesign with this in mind. Here are some of the steps we took that had an impact.

We decided to improve our site

First of all, we decided to fix our company website. This sounds like an obvious one, but how many times have you thought, “I’ll do this really important thing”, then never gotten round to it? Or rushed ahead in excitement, made a few tweaks yourself, then let your efforts grind to a halt because other things took precedence?

This is an all-too-common problem when you run a business and things are just… okay. Often there’s no real drive to fix things and we fall back into doing what seems more pressing: selling, talking to customers, and running the business.

Deciding you want to improve your site’s conversions starts with a decision that involves you and everyone else in the company, and that’s what we did. We got the design and analytics experts involved. We invested time and money into the project, which made it feel substantial. We even made EDMs (electronic direct mail campaigns) to announce the site launch, to let everyone know what we’d been up to. In short, we made it feel like an event.

We got to know our users

There are many different types of user: some are ready to buy, some are just doing some window shopping. Knowing what type of person visits your site will help you create something that caters to their needs.

We looked at our analytics data and discovered visitors to our site were a bit of both, but tended to be more ready to buy than not. This meant we needed to focus on getting macro-conversions — in other words, make our site geared towards sales — while not overlooking the visitors doing some initial research. For those users, we implemented a blog as a way to improve our SEO, educate leads, and build up our reputation.

User insight can also help you shape the feel of your site. We discovered that the marketing managers we were targeting at the time were predominantly women, and that certain images and colours resonated better among that specific demographic. We didn’t go for the obvious (pictures of the team or our offices), instead relying on data and the psychology of attraction to delve into the minds of the users.

We improved site speed

Sending visitors to good sites with bad speeds erodes trust and sends them running. Multiple studies show that site speed matters when it comes to conversion rates. It’s one of the top SEO ranking factors, and a big factor when it comes to user experience: pages that load in under a second convert at around 2.5 times the rate of pages that take five seconds or more.
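Before optimizing anything, it helps to know where your own pages currently sit. The sketch below reads time to first byte and total load time from the browser’s Navigation Timing API; real monitoring tools (or the Moz guides mentioned below) go much further, but this gives you a quick baseline.

```typescript
// Quick baseline: time to first byte (TTFB) and total load time, read from
// the browser's Navigation Timing data. Run it shortly after the "load"
// event so loadEventEnd has been populated.
window.addEventListener("load", () => {
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      "navigation"
    ) as PerformanceNavigationTiming[];
    if (!nav) return;

    const ttfb = nav.responseStart - nav.startTime; // server + network latency
    const loadTime = nav.loadEventEnd - nav.startTime; // full page load

    console.log(`TTFB: ${Math.round(ttfb)} ms, load time: ${Math.round(loadTime)} ms`);
  }, 0);
});
```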

We built our website for speed. Moz has a great guide on page speed best practices, and from that list, we did the following things:

  • We optimized images.

  • We managed our own caching.

  • We compressed our files.

  • We improved page load times (Moz has another great article about how to speed up time to first byte). A good web page load time is considered to be anything under two seconds — which we achieved.

  • In addition, we also customized our own hosting to make our site faster.
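To make the caching and compression points above concrete, here is a minimal sketch of how they might look on a Node/Express stack. This is an assumption for illustration only; the article doesn’t say what stack the site runs on, and the same ideas apply equally to a CDN or an Nginx/Apache configuration.

```typescript
// Illustrative only: response compression plus long-lived caching for static
// assets on an Express server. Assumes the "express" and "compression"
// packages are installed; it is not the setup described in this article.
import express from "express";
import compression from "compression";

const app = express();

// Compress text-based responses (HTML, CSS, JS, JSON).
app.use(compression());

// Serve static assets (images, fonts, bundles) with a long max-age so
// repeat visitors load them from the browser cache instead of the network.
app.use("/assets", express.static("public", { maxAge: "30d", etag: true }));

app.get("/", (_req, res) => {
  // Keep HTML on a short cache so content changes still show up quickly.
  res.set("Cache-Control", "public, max-age=300");
  res.send("<h1>Hello</h1>");
});

app.listen(3000, () => console.log("Listening on http://localhost:3000"));
```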

We introduced more tracking

As well as making our site faster, we introduced a lot more tracking. That allowed us to refine our content, our messaging, the structure of the site, and so on, which continually adds to the conversion.

We used Google Optimize to run A/B tests across a variety of things to understand how people interacted with our site. Here are some of the tweaks we made that had a positive impact:

  • Social proofing can be a really effective tool if used correctly, so we added some stats to our landing page copy.

  • Google Analytics showed us visitors were reaching certain pages and not knowing quite where to go next, so we added CTAs that used active language. So instead of saying, “If you’d like to find out more, let us know”, we said “Get a quote”, along with two options for getting in touch.

  • We spent an entire month testing four words on our homepage. We actually failed (the words didn’t have a positive impact), but it allowed us to test our hypothesis. We did small tweaks and tests like this all over the site.

  • We used heat mapping to see where visitors were clicking, and which words caught their eye. With this data, we knew where to place buttons and key messaging.
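
Google Optimize reports whether a variant “wins” for you, but the underlying idea is simple to sketch. The snippet below is an illustrative two-proportion z-test in plain Python, not anything from Optimize itself, and the visitor and conversion counts are made up.

```python
# Illustrative sketch: compare two page variants with a two-proportion z-test.
# The visitor and conversion counts below are made up for demonstration.
from math import sqrt
from statistics import NormalDist

def compare_variants(visitors_a, conversions_a, visitors_b, conversions_b):
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return rate_a, rate_b, p_value

rate_a, rate_b, p = compare_variants(visitors_a=2400, conversions_a=96,
                                     visitors_b=2350, conversions_b=129)
print(f"Original: {rate_a:.1%}, variant: {rate_b:.1%}, p-value: {p:.3f}")
```

A p-value below roughly 0.05 is the usual (if arbitrary) bar for calling a difference real; a “failed” test, like our four-word homepage experiment, is still useful information.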

We looked into user behavior

Understanding your visitor is always a good place to start, and there are two ways to go about this:

  1. Quantitative research (numbers and data-based research)

  2. Qualitative research (people-based research)

We did a mixture of both.

For the quantitative research, we used Google Analytics, Google Optimize, and Hotjar to get an in-depth, numbers-based look at how people were interacting with our site.

Heat-mapping software shows how people click and scroll through a page. Hot spots indicate places where people naturally gravitate.

We could see where people were coming into our site (which pages they landed on first), what channel brought them there, which features they were engaging with, how long they spent on each page, and where they abandoned the site.
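
Most of this analysis happened inside the Google Analytics and Hotjar dashboards, but you can ask the same questions of a plain CSV export. As a rough illustration, the pandas sketch below ranks pages by exit rate to surface likely drop-off points; the file name and the "page", "sessions", and "exits" columns are assumptions about a hypothetical export, not a fixed format.

```python
# Rough sketch: find the pages where visitors most often leave the site.
# Assumes a hypothetical CSV export with "page", "sessions", and "exits" columns.
import pandas as pd

df = pd.read_csv("analytics_export.csv")
df["exit_rate"] = df["exits"] / df["sessions"]

# Ignore pages with too little traffic to be meaningful, then rank by exit rate.
busy_pages = df[df["sessions"] >= 100]
drop_offs = busy_pages.sort_values("exit_rate", ascending=False)

print(drop_offs[["page", "sessions", "exits", "exit_rate"]].head(10))
```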

For the qualitative research, we focused primarily on interviews.

  • We asked customers what they thought about certain CTAs (whether they worked or not, and why).

  • We made messaging changes and asked customers and suppliers whether they made sense.

  • We invited a psychologist into the office and asked them what they thought about our design.

What we learned

We found out our design was good, but our CTAs weren’t quite hitting the mark. For example, one CTA only gave the reader the option to call. But, as one of our interviewees pointed out, not everyone likes using the phone — so we added an email address.

We were intentional but ad hoc in how we asked questions. This worked for us, but you might want to be a bit more formal about your approach (Moz has a great practical guide to conducting qualitative usability testing if you’re after a more in-depth look).

The results

Combined, these minor tweaks had a mighty impact. There’s a big difference in how our site looks and how we rank. The bottom line: after the rebuild, we got more work, and the business did much better. Here are some of the gains we’ve seen over the past two years.

  • Our site speed increased: we managed to achieve a load time of around 500-600 ms.

  • Our dwell time increased from around 1.5 minutes to 2.5 minutes.

  • We received four times as many inquiries by email and phone.

  • Our organic traffic increased despite us not channeling more funds into PPC ads.

  • We also realized our clients were bigger, paying around twice as much per job on average: in mid-2018, our average job value was $8,000; now, it’s $17,000.

  • Our client list came to include more recognizable, household names, including two of Australia’s top universities and a well-known manufacturing brand.

  • Within the first 26 weeks, we got over $770,000 worth of sales opportunities (if we’d accepted every job that came our way).

  • Our prospects began asking to work with us, rather than us having to persuade them to give us the business.

  • We started getting higher quality inquiries — warmer leads who had more intent to buy.

Some practical changes you can make to improve your website conversions

When it comes to website changes, it’s important to remember that what works for one person might not work for you.

We’ve used site speed boosters for our clients before and gotten really great results. At other times, we’ve tried them and they just broke the website. This is why it’s so important to measure as you go, use what works for your individual needs, and remember that “failures” are just as helpful as wins.

Below are some tips — some of which we did on our own site, others are things we’ve done for others.

Tip number 1: Get stronger hosting that lets you consider things like CDNs. Hiring a developer should always be your top choice, but that luxury isn’t always possible. In that case, we recommend considering a CDN and, depending on how your site is built, paying for a tool like NitroPack, which can help with caching and compression for faster site speeds.

Tip number 2: Focus your time. Identify your top landing pages with Moz Pro and channel your efforts into those pages as a priority. Use the 80/20 principle and put your attention on the 20% that gets you 80% of your success (a rough prioritization sketch follows below).
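
One rough way to find that 20% is to rank your landing pages by the conversions they drive and see how few pages account for most of the total. The sketch below does this with pandas; the file name and column names are assumptions about a hypothetical analytics export.

```python
# Rough sketch of the 80/20 check: which pages drive ~80% of conversions?
# Assumes a hypothetical export with "page" and "conversions" columns.
import pandas as pd

df = pd.read_csv("landing_pages.csv")
df = df.sort_values("conversions", ascending=False).reset_index(drop=True)
df["cumulative_share"] = df["conversions"].cumsum() / df["conversions"].sum()

# Pages up to (and including) the one that pushes the running total past 80%.
cutoff = (df["cumulative_share"] >= 0.80).idxmax()
top_pages = df.loc[:cutoff]

print(f"{len(top_pages)} of {len(df)} pages drive ~80% of conversions:")
print(top_pages[["page", "conversions", "cumulative_share"]])
```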

Tip number 3: Run A/B tests using Google Optimize to test various hypotheses and ideas (Moz has a really handy guide for running split tests using Google). Don’t be afraid of the results: failures can help confirm that what you’re currently doing is right. You can also access some in-depth data about your site’s performance in Google Lighthouse.

Tip number 4: Trial various messages in Google Ads (as a way of testing targeted messaging). Google provides many keyword suggestions on trending words and phrases that are worth considering.

Tip number 5: Combine qualitative and quantitative research to get to know how your users interact with your site — and keep testing on an ongoing basis.

Tip number 6: Don’t get too hung up on charts going up, or figures turning orange: do what works for you. If adding a video to your homepage slows it down a little but has an overall positive effect on your conversion, then it’s worth the tradeoff.

Tip number 7: Prioritize the needs of your target customers and focus every build and design choice around them.

Recommended tools

  • NitroPack: speed up your site if you’ve not built it for speed from the beginning.

  • Google Optimize: run A/B tests.

  • Hotjar: see how people use your site via heat mapping and behaviour analytics.

  • Pingdom / GTmetrix: measure site speed (using both is better if you want to make sure you meet everyone’s requirements).

  • Google Analytics: find drop-off points, track conversions, run A/B tests, set goals.

  • Qualaroo: poll your visitors while they are on your site with a popup window.

  • Google Consumer Surveys: create a survey; Google recruits the participants and provides results and analysis.

  • Moz Pro: identify top landing pages by connecting this tool to your Google Analytics profile to create custom reports.

How to keep your conversion rates high

Treat your website like your car: regular little tweaks to keep it purring, and occasional deeper inspections to make sure there are no problems lurking just out of sight. Here’s what we do:

  • We look at Google Analytics monthly. It helps to understand what’s working, and what’s not.

  • We use goal tracking in GA to keep things moving in the right direction.

  • We use Pingdom’s free service to monitor the availability and response time of our site (a minimal check of our own is sketched after this list).

  • We regularly ask people what they think about the site and its messaging (keeping the qualitative research coming in).
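
Pingdom handles the monitoring for us, but a tiny script can act as a sanity check of your own. Here’s a minimal sketch (the URL is a placeholder) that appends the status code and response time of one page to a log file; scheduled with cron or a task scheduler, it gives you a basic availability history.

```python
# Minimal availability check: log status code and response time for one page.
# The URL is a placeholder; schedule this with cron or a task scheduler.
from datetime import datetime, timezone
import requests

URL = "https://www.example.com/"

try:
    response = requests.get(URL, timeout=10)
    status = response.status_code
    elapsed_ms = response.elapsed.total_seconds() * 1000
    line = f"{datetime.now(timezone.utc).isoformat()} {URL} {status} {elapsed_ms:.0f} ms"
except requests.RequestException as exc:
    line = f"{datetime.now(timezone.utc).isoformat()} {URL} ERROR {exc}"

with open("uptime.log", "a") as log_file:
    log_file.write(line + "\n")
print(line)
```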

Conclusion

Spending money on marketing is a good thing, but without a good conversion rate, your website behaves like a leaky bucket. Your website is one of your strongest sales tools, so it really does pay to make sure it’s working at peak performance.

I’ve shared a few of my favorite tools and techniques, but above all, my one bit of advice is to consider your own requirements. You could improve your site speed by stripping out every script and tag and keeping the pages plain, but that’s probably not what you want. The goal is finding the balance between creativity and performance, and that will always depend on what’s important to you.

As a design agency, we need a site that’s beautiful and creative. Yes, having a moving background on our homepage slows it down a little, but it improves our conversions overall.

The bottom line: Consider your unique users, and make sure your website is in line with the goals of whoever you’re speaking with.

We can do all we want to please Google, but when it comes to sales and leads, it means more to have a higher-converting, more effective website. We did well on inquiries (actual phone calls and email leads) even as Google’s site performance requirements rapidly increased. That comes down to one thing: having an effective customer conversion framework for your site.

Reblogged 2 days ago from feedproxy.google.com

New technology from Apple & others help me plug in; Wednesday’s daily brief

Plus more on Google search-related changes that you may have missed.

Please visit Search Engine Land for the full article.

Reblogged 2 days ago from feeds.searchengineland.com