Tuesday, November 29, 2022

Local Pack Header Specificity Vanishes while Local Packs Downtrend

In July of this year, Dr. Peter J. Meyers and I published a report analyzing an element of Google’s local results we termed “local pack headers”. About a month after publication, members of the local SEO community, like Colan Nielsen, began noticing that the extraordinary diversity of headings we had captured had suddenly diminished:

Today, I’m doing a quick follow-up to the manual portion of our earlier study in an effort to quantify and illustrate this abrupt alteration.

A total sea change in local pack headers

Between July and November of 2022, 83% of our previously-queried local pack headers underwent a complete transformation of nomenclature. Only 17% of the local pack headers were still worded the same way in autumn as they had been in the summertime. Here is a small set of examples:

In our manual analysis of 60 queries in July, we encountered 40 unique local pack headers - a tremendous variety. Now, all specificity is gone. For all of our queries, headings have been reduced to just 3 types: in-store availability, places, and businesses.

Entity relationships remain mysterious

What hasn’t changed is my sense that the logic underpinning which businesses receive which local pack header remains rather odd. In the original study, we noted the mystery of why a query like “karate” fell under the heading of “martial arts school” but a query for “tai chi” got a unique “tai chi” heading, or why “adopt dog” results were headed “animal rescue services” but “adopt bunny” got a pack labeled “adopt bunny”. The curious entity relationships continue, even in this new, genericized local pack header scenario. For example, why is my search for “tacos” (which formerly brought up a pack labeled “Mexican restaurants”) now labeled like this:

But my search for “oil change” gets this header:

Is there something about a Mexican restaurant that makes it more of a “place” and an oil change spot that makes it more of a “business”? I don’t follow the logic. Meanwhile, why are service area businesses, as shown in my search for “high weed mowing,” being labeled “places”?

Surely high weed mowing is not a place…unless it is a philosophical one. Yet I saw many SABs labeled this way instead of as “businesses”, which would seem a more rational label, given Google’s historic distinction between physical premises and go-to-client models. There are many instances like this in which the labeling doesn’t make much horse sense, and with the more specific wording now absent, local pack headers are likely to convey less meaning and be more easily overlooked.

Why has Google done this and does it matter to your local search marketing?

Clearly, Google decided to streamline their classifications. There may be more than three total local pack header types, but I have yet to see them. Hotel packs continue to have their own headings, but they have always been a different animal:

In general, Google experiments with whatever they think will move users about within their system, and perhaps they felt the varied local pack headers were more of a distraction than an aid to interactivity with the local packs. We can’t know for sure, nor can we say how long this change will remain in place, because Google could bring back the diverse headings the day after I publish this column!

As to whether this matters to your local search campaigns, unfortunately, the generic headers do obscure former clues to the mind of Google that might have been useful in your SEO. I previously suggested that local businesses might want to incorporate the varied local pack terms into the optimization of their website tags and text, but in the new scenario, it is likely to be pointless to optimize anything for “places”, “businesses”, or “in-store availability”. It’s a given that your company is some kind of place or business if you’re creating a Google Business Profile for it. And your best bet for showing that you carry certain products is to publish them on your listing and consider whether you want to opt into programs like Pointy.

In sum, this change is not a huge deal, but I’m a bit sorry to see the little clues of the diversified headers vanish from sight. Meanwhile, there’s another local pack trend going on right now that you should definitely be paying attention to…

A precipitous drop in overall local pack presence

In our original study, Google did not return a local pack for 18% of our manual July queries. By November, the picture had significantly changed. A startling 42% of our queries suddenly no longer displayed a local pack. This is right in line with Andrew Shotland’s documentation of a 42.3% drop from peak local pack display between August and October. Mozcast, pictured above, captured a drop from 39.6% of queries returning local packs on October 24th to just 25.1% on October 25th. The number has remained in the low-to-mid 20s in the ensuing weeks. It’s enough of a downward slope to give one pause.

Because I’m convinced that economic localism is critical to healing the climate and society, I would personally like Google to return local packs for all commercial queries so that searchers can always see the nearest resource for purchasing whatever they need, but if Google is reducing the number of queries for which they deliver local results, I have to try to understand their thinking.

To do that, I have to remember that the presence of a local pack is a signal that Google believes a query has a local intent. Likely, they often get this right, but I can think of times when a local result has appeared for a search term that doesn’t seem to me to be obviously, inherently local. For example, in the study Dr. Pete and I conducted, we saw Google not just returning a local pack for the keyword “pickles” but even giving it its own local pack header:

If I search for pickles, am I definitely looking for pickles near me, or could I be looking for recipes, articles about the nutritional value of pickles, the history of pickles, something else? How high is Google’s confidence that vague searches like these should be fulfilled with a local result?

After looking at a number of searches like these in the context of intent, my current thinking is this: for some reason unknown to us, Google is dialing back presumed local intent. Ever since Google made the user the centroid of search and began showing us nearby results almost by default for countless queries, we users became trained not to have to add many (or any) modifiers to our search language to prompt Google to lay out our local options for us. We could be quite lazy in our searches and still get local results.

In the new context of a reduced number of searches generating local packs, though, we will have to rehabituate ourselves to writing more detailed queries to get to what we want if Google no longer thinks our simple search for “pickles” implies “pickles near me”. I almost get the feeling that Google wants us to start being more specific again because its confidence level about what constitutes a local search has suffered some kind of unknown challenge.

It’s also worth throwing into our thinking what our friends over at NearMedia.co have pointed out:

“The Local Pack's future is unclear. EU's no "self-preferencing" DMA takes effect in 2023. The pending AICOA has a similar language.”

It could be that Google’s confidence is being shaken in a variety of ways, including by regulatory rulings, and local SEOs should always expect change. For now, though, local businesses may be experiencing some drop in their local pack traffic and CTR. On the other hand, if Google is getting it right, there may be no significant loss. If your business was formerly showing up in a local pack for a query that didn’t actually have a local intent, you likely weren’t getting those clicks anyway because a local result wasn’t what the searcher was looking for to begin with.

That being said, I am seeing examples in which I feel Google is definitely getting it wrong. For instance, my former searches for articles of furniture all brought up local packs with headings like “accent chairs” or “lamps”. Now, Google is returning no local pack for some of these searches and is instead plugging an enormous display of remote, corporate shopping options. There are still furniture stores near me, but Google is now hiding them, and that disappoints me greatly:


So here’s today’s word to the wise: keep working on the organic optimization of your website and the publication of helpful content. Both will underpin your key local pack rankings, and as we learned from our recent large-scale local business review survey, 51% of consumers are going to end up on your site as their next step after reading reviews on your listings. 2023 will be a good year to invest in the warm and inclusive welcome your site is offering people, and the investment will also stand you in good stead however local pack elements like headers, or even local packs themselves, wax and wane.

Monday, November 28, 2022

4 Common Mistakes E-commerce Websites Make Using JavaScript

Despite the resources they can invest in web development, large e-commerce websites still struggle with SEO-friendly ways of using JavaScript.

And even though 98% of all websites use JavaScript, it’s still common for Google to have problems indexing pages that rely on it. While it’s okay to use JavaScript on your website in general, remember that it requires extra computing resources to be processed into HTML code that bots can understand.

At the same time, new JavaScript frameworks and technologies are constantly arising. To give your JavaScript pages the best chance of indexing, you'll need to learn how to optimize them for the sake of your website's visibility in the SERPs.

Why is unoptimized JavaScript dangerous for your e-commerce?

By leaving JavaScript unoptimized, you risk your content not getting crawled and indexed by Google. And in the e-commerce industry, that translates to losing significant revenue, because products are impossible to find via search engines.

It’s likely that your e-commerce website uses dynamic elements that are pleasant for users, such as product carousels or tabbed product descriptions. This JavaScript-generated content very often is not accessible to bots. Googlebot cannot click or scroll, so it may not access all those dynamic elements.

Consider how many of your e-commerce website users visit the site via mobile devices. JavaScript is slower to load, so the longer it takes to load, the worse your website’s performance and user experience become. If Google realizes that it takes too long to load JavaScript resources, it may skip them when rendering your website in the future.

Top 4 JavaScript SEO mistakes on e-commerce websites

Now, let’s look at some top mistakes when using JavaScript for e-commerce, and examples of websites that avoid them.

1. Page navigation relying on JavaScript

Crawlers don’t act the same way users do on a website ‒ they can’t scroll or click to see your products. Bots must follow links throughout your website structure to understand and access all your important pages fully. If your navigation relies only on JavaScript, bots may see nothing beyond the products on the first page of pagination.

Guilty: Nike.com

Nike.com uses infinite scrolling to load more products on its category pages. And because of that, Nike risks its loaded content not getting indexed.

For the sake of testing, I entered one of their category pages and scrolled down to choose a product that only loads after scrolling. Then, I used the “site:” command to check if the URL is indexed in Google. And as you can see in the screenshot below, this URL is impossible to find on Google:

Of course, Google can still reach your products through sitemaps. However, finding your content in any other way than through links makes it harder for Googlebot to understand your site structure and dependencies between the pages.

To make it even more apparent to you, think about all the products that are visible only when you scroll for them on Nike.com. If there’s no link for bots to follow, they will see only 24 products on a given category page. Of course, for the sake of users, Nike can’t serve all of its products on one viewport. But still, there are better ways of optimizing infinite scrolling to be both comfortable for users and accessible for bots.

Winner: Douglas.de

Unlike Nike, Douglas.de uses a more SEO-friendly way of serving its content on category pages.

They provide bots with page navigation based on <a href> links to enable crawling and indexing of the next paginated pages. As you can see in the source code below, there’s a link to the second page of pagination included:

Moreover, the paginated navigation may be even more user-friendly than infinite scrolling. The numbered list of category pages may be easier to follow and navigate, especially on large e-commerce websites. Just think how long the viewport would be on Douglas.de if they used infinite scrolling on the page below:
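To illustrate the pattern rather than Douglas.de’s exact markup, here is a minimal sketch of crawlable category pagination. The URLs and page numbers are hypothetical; the point is that every further page of products sits behind a standard <a href> link that bots can follow without executing any JavaScript:

    <!-- Hypothetical example of crawlable pagination (not Douglas.de's actual code) -->
    <nav aria-label="Category pages">
      <a href="/make-up/">1</a>
      <a href="/make-up/?page=2">2</a>
      <a href="/make-up/?page=3">3</a>
      <a href="/make-up/?page=2">Next</a>
    </nav>

Because each paginated URL is a plain link in the initial HTML, crawlers can discover products beyond the first page even if users never click the page numbers themselves.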

2. Generating links to product carousels with JavaScript

Product carousels with related items are one of the essential e-commerce website features, and they are equally important from both the user and business perspectives. Using them can help businesses increase their revenue as they serve related products that users may be potentially interested in. But if those sections over-rely on JavaScript, they may lead to crawling and indexing issues.

Guilty: Otto.de

I analyzed one of Otto.de’s product pages to identify if it includes JavaScript-generated elements. I used the What Would JavaScript Do (WWJD) tool that shows screenshots of what a page looks like with JavaScript enabled and disabled.

Test results clearly show that Otto.de relies on JavaScript to serve related and recommended product carousels on its website. And from the screenshot below, it’s clear that those sections are invisible with JavaScript disabled:

How might this affect the website’s indexing? If Googlebot lacks the resources to render JavaScript-injected links, the product carousels can’t be found and indexed.

Let’s check if that’s the case here. Again, I used the “site:” command and typed the title of one of Otto.de’s product carousels:

As you can see, Google couldn’t find that product carousel in its index. And the fact that Google can’t see that element means that accessing additional products will be more complex. Also, if you prevent crawlers from reaching your product carousels, you’ll make it more difficult for them to understand the relationship between your pages.

Winner: Target.com

In the case of Target.com’s product page, I used the Quick JavaScript Switcher extension to disable all JavaScript-generated elements. I paid particular attention to the “More to consider” and “Similar items” carousels and how they look with JavaScript enabled and disabled.

As shown below, disabling JavaScript changed the way the product carousels look for users. But has anything changed from the bots' perspective?

To find out, I checked what the HTML version of the page looks like to bots by analyzing its cached version.

To check the cache version of Target.com’s page above, I typed “cache:https://www.target.com/p/9-39-...”, which is the URL address of the analyzed page. Also, I took a look at the text-only version of the page.

When scrolling, you’ll see that the links to related products can also be found in its cache. If you see them here, it means bots don’t struggle to find them, either.

However, keep in mind that the links to the exact products you can see in the cache may differ from the ones on the live version of the page. It’s normal for the products in the carousels to rotate, so you don’t need to worry about discrepancies in specific links.

But what exactly does Target.com do differently? They take advantage of dynamic rendering. They serve the initial HTML, including the links to products in the carousels, as static HTML that bots can process.

However, you must remember that dynamic rendering adds an extra layer of complexity that may quickly get out of hand with a large website. I recently wrote an article about dynamic rendering that’s a must-read if you are considering this solution.

Also, the fact that crawlers can access the product carousels doesn’t guarantee these products will get indexed. However, it will significantly help them flow through the site structure and understand the dependencies between your pages.
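For readers who want a concrete picture of the approach, here is a minimal, hypothetical sketch of dynamic rendering in a Node.js/Express app. It is not Target.com’s actual implementation: the route, the bot pattern, and the renderStaticSnapshot helper are all invented for illustration, and in practice the snapshot would come from a headless browser or a prerendering service:

    // Minimal sketch of dynamic rendering (illustrative only).
    // Known bots get a prerendered HTML snapshot; everyone else gets the JS-driven page.
    const express = require('express');
    const app = express();

    const BOT_PATTERN = /googlebot|bingbot|duckduckbot/i; // simplified bot detection

    // Placeholder for a real prerendering step (headless browser or prerender service).
    async function renderStaticSnapshot(url) {
      return `<html><body><h1>Static snapshot of ${url}, including carousel links</h1></body></html>`;
    }

    app.get('/p/:productId', async (req, res) => {
      const userAgent = req.headers['user-agent'] || '';

      if (BOT_PATTERN.test(userAgent)) {
        // Crawlers receive static HTML they can process without running JavaScript.
        const html = await renderStaticSnapshot(req.originalUrl);
        return res.send(html);
      }

      // Regular visitors receive the normal client-side rendered app shell.
      res.sendFile('index.html', { root: 'public' });
    });

    app.listen(3000);

The trade-off mentioned above applies here: every route that bots can reach needs a reliable snapshot, which is exactly the extra layer of complexity that can get out of hand on a large site.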

3. Blocking important JavaScript files in robots.txt

Blocking JavaScript for crawlers in robots.txt by mistake may lead to severe indexing issues. If Google can’t access and process your important resources, how is it supposed to index your content?

Guilty: Jdl-brakes.com

It’s impossible to fully evaluate a website without a proper site crawl. But looking at its robots.txt file can quickly reveal whether any critical content is blocked.

This is the case with the robots.txt file of Jdl-brakes.com. As you can see below, they block the /js/ path with the Disallow directive. It makes all internally hosted JavaScript files (or at least the important ones) invisible to all search engine bots.

Misusing the Disallow directive like this may result in rendering problems across your entire website.

To check if it applies in this case, I used Google’s Mobile-Friendly Test. This tool can help you navigate rendering issues by giving you insight into the rendered source code and the screenshot of a rendered page on mobile.

I headed to the “More info” section to check if any page resources couldn’t be loaded. Looking at one of the product pages on Jdl-brakes.com as an example, you can see that it needs a specific JavaScript file to be fully rendered. Unfortunately, that can’t happen because the whole /js/ folder is blocked in its robots.txt.

But let’s find out if those rendering problems affected the website’s indexing. I used the “site:” command to check if the main content (product description) of the analyzed page is indexed on Google. As you can see, no results were found:

This is an interesting case where Google could reach the website's main content but didn’t index it. Why? Because Jdl-brakes.com blocks its JavaScript, Google can’t properly see the layout of the page. And even though crawlers can access the main content, it’s impossible for them to understand where that content belongs in the page’s layout.

Let’s take a look at the Screenshot tab in the Mobile-Friendly Test. This is how crawlers see the page’s layout when Jdl-brakes.com blocks their access to CSS and JavaScript resources. It looks pretty different from what you can see in your browser, right?

The layout is essential for Google to understand the context of your page. If you’d like to know more about this crossroads of web technology and layout, I highly recommend looking into a new field of technical SEO called rendering SEO.

Winner: Lidl.de

Lidl.de proves that a well-organized robots.txt file can help you control your website’s crawling. The crucial thing is to use the disallow directive consciously.

Although Lidl.de blocks a single JavaScript file with the Disallow directive /cc.js*, it seems it doesn’t affect the website’s rendering process. The important thing to note here is that they block only a single JavaScript file that doesn’t influence other URL paths on a website. As a result, all other JavaScript and CSS resources they use should remain accessible to crawlers.

With a large e-commerce website, you can easily lose track of all the directives you’ve added. Always include as many path fragments as possible in the URLs you want to block from crawling; it will help you avoid blocking crucial pages by mistake.
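As a purely hypothetical illustration of that advice (these paths are invented and do not correspond to Lidl.de’s or any other real file), compare a directive that blocks an entire script folder with one that targets a single, non-critical file:

    # Example A - too broad (hypothetical robots.txt):
    # every internally hosted JavaScript file becomes invisible to crawlers,
    # which can break rendering site-wide.
    User-agent: *
    Disallow: /js/

    # Example B - deliberately narrow (hypothetical robots.txt):
    # only one non-critical tracking script is blocked, so the JavaScript and CSS
    # needed for rendering stay accessible.
    User-agent: *
    Disallow: /assets/tracking/cc.js

The second version follows the "as many path fragments as possible" rule: the longer, more specific path makes it far less likely that an important resource gets caught by accident.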

4. JavaScript removing main content from a website

If you use unoptimized JavaScript to serve the main content on your website, such as product descriptions, you block crawlers from seeing the most important information on your pages. As a result, your potential customers looking for specific details about your products may not find such content on Google.

Guilty: Walmart.com

Using the Quick JavaScript Switcher extension, you can easily disable all JavaScript-generated elements on a page. That’s what I did in the case of one of Walmart.com’s product pages:

As you can see above, the product description section disappeared with JavaScript disabled. I decided to use the “site:” command to check if Google could index this content. I copied the fragment of the product description I saw on the page with JavaScript enabled. However, Google didn’t show the exact product page I was looking for.

Will users go out of their way to find that particular product via Walmart.com? Some may, but others will simply head to another store selling the same item.

The Walmart.com example shows that when main content depends on JavaScript to load, it’s more difficult for crawlers to find and display your valuable information. However, that doesn’t necessarily mean you should eliminate all JavaScript-generated elements from your website.

To fix this problem, Walmart has two solutions:

  1. Implementing dynamic rendering (prerendering), which is, in most cases, the easiest option from an implementation standpoint.

  2. Implementing server-side rendering. This is the solution that will solve the problems we are observing on Walmart.com without serving different content to Google and users (as happens with dynamic rendering); a simplified sketch follows below. In most cases, server-side rendering also helps with web performance issues on lower-end devices, as all of your JavaScript is rendered by your servers before it reaches the client's device.
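Here is that simplified sketch: a hypothetical Express route that uses React’s renderToString so the product description ships in the initial HTML. It is not Walmart.com’s stack or code; the component, route, and sample copy are all invented for illustration:

    // Minimal sketch of server-side rendering with Express and React (illustrative only).
    const express = require('express');
    const React = require('react');
    const { renderToString } = require('react-dom/server');

    // Hypothetical product description component.
    function ProductDescription({ text }) {
      return React.createElement('section', { id: 'description' }, text);
    }

    const app = express();

    app.get('/product/:id', (req, res) => {
      // In a real app, this text would come from a product database or API.
      const descriptionHtml = renderToString(
        React.createElement(ProductDescription, { text: 'Full product description goes here.' })
      );

      // The description is part of the initial HTML response, so crawlers and users
      // get the same content without waiting for client-side JavaScript.
      res.send('<!DOCTYPE html><html><body><div id="root">' + descriptionHtml + '</div></body></html>');
    });

    app.listen(3000);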

Let’s have a look at the JavaScript implementation that’s done right.

Winner: IKEA.com

IKEA proves that you can present your main content in a way that is accessible for bots and interactive for users.

On IKEA.com’s product pages, product descriptions are served behind clickable panels. When you click one, the description dynamically appears on the right-hand side of the viewport.

Although users need to click to see product details, IKEA also serves that crucial part of its pages even with JavaScript off:

This way of presenting crucial content should make both users and bots happy. From the crawlers’ perspective, serving product descriptions that don’t rely on JavaScript makes them easy to access. Consequently, the content can be found on Google.
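A stripped-down sketch of this pattern (not IKEA.com’s actual markup, and the copy is invented) shows the idea: the description text ships in the initial HTML, and JavaScript only toggles whether it is visible:

    <!-- Hypothetical example: the description is in the HTML even though users must click to see it. -->
    <button id="show-description">Product details</button>

    <div id="description-panel" hidden>
      <p>Solid pine frame, 140x200 cm, assembly required. The full description text lives here.</p>
    </div>

    <script>
      // JavaScript only reveals content that is already in the document.
      document.getElementById('show-description').addEventListener('click', function () {
        document.getElementById('description-panel').hidden = false;
      });
    </script>

Because the text exists in the served HTML whether or not the script runs, bots that skip JavaScript still see the product description.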

Wrapping up

JavaScript doesn’t have to cause issues if you know how to use it properly. As an absolute must-do, you need to follow indexing best practices. Doing so will help you avoid basic JavaScript SEO mistakes that can significantly hinder your website’s visibility on Google.

Take care of your indexing pipeline and check if:

  • You allow Google access to your JavaScript resources,

  • Google can access and render your JavaScript-generated content. Focus on the crucial elements of your e-commerce site, such as product carousels or product descriptions,

  • Your content actually gets indexed on Google.

If my article got you interested in JS SEO, find more details in Tomek Rudzki’s article about the 6 steps to diagnose and solve JavaScript SEO issues.

Friday, November 25, 2022

3 Important Google Updates to Understand — Whiteboard Friday

With recent shake-ups to the Google algorithm, Lily Ray joins us for this week’s episode to walk you through three of the most important types of search engine updates that can affect your SEO strategies.

whiteboard outlining three important google algorithm updates to understand

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, everyone. I'm Lily Ray, and today I'm going to be talking about a few different types of Google updates. 

Helpful content update

So we're going to start with the helpful content update, which was announced and rolled out in August of 2022. The helpful content update introduced a new concept in the SEO space, which is a sitewide classifier that will be identifying unhelpful content. This is content that Google has determined is primarily written for search engines and is not exactly helpful for users.

This new sitewide classifier that they're using with the helpful content update will be applied to sites that Google believes are doing this type of thing at scale. So if the majority of the content on the website is considered unhelpful, meaning it's written primarily for search engines and not for users, the helpful content update classifier can be applied to those sites, which can affect the rankings for the whole site. So it's not just one or two pages. It's potentially the entire site, even if the website has some content that's actually helpful.

So this was introduced in about mid-2022, and Google has explained that it's going to be using machine learning with the helpful content update classifier, which means that the classifier is learning and growing and evolving over time. As it begins to understand different patterns and different signals about sites that do provide unhelpful content, it can continue to impact those sites over time.

So while they told us the update rolled out in August, lasted about two weeks, and then concluded, we also know that Google will likely be leveraging the helpful content update classifier all the time or in future updates. They told us that if there's a big, significant change to how they use this update, they'll let us know, but otherwise, we should assume that it's kind of there operating in the background. So this was a new development for 2022.

Product reviews update

As for the product reviews update, there have been a variety of them. They started in 2021, and this was also a new type of update from Google in which Google is telling us that if you're a website that provides product review content, you need to meet certain criteria for content quality that they're looking for. The backstory behind this and the reason that I believe Google rolled out these product review updates is because there are many, many websites that have reviews of products, that have affiliate links, that are making money through SEO, through having these affiliate websites, but they don't add a lot of value. They don't tell you a lot of insights about the product that's different than what Google has already seen before. Google has received a lot of feedback from users that it's not particularly helpful when they read a product review that's just saying the same thing the manufacturer said about the products or that other sites have already said about the products.

So there's been a variety of different product review updates because I believe that they're refining this set of algorithms to basically elevate the best product reviews on the internet. Google has told us that they should be written by experts or enthusiasts that know the products very well. So people that are obsessed with tech devices, like smart watches or TVs or whatever, they need to prove that they've spent a lot of time analyzing these products, that they have an obsession with it, maybe they studied it, maybe they have pictures of themselves using it, anything that gives the user and search engines evidence that they've actually spent a lot of time with the product.

This is another very important concept from these updates. Google has specifically said, if you're providing product reviews, we need evidence. We need photos of you using the product. We need videos of you using it. We need anything that shows us that you're not just rehashing what everybody else has already said online. We need proof that you've actually spent the time doing it.

So a lot of sites are starting to adapt their product review strategy to meet Google's expectations of what makes a good product review. As a result of this, almost every product review update that rolls out, you'll see a lot of volatility in the search results because some product review sites are winning from these updates, some are losing. Then there are other sites that are being impacted by these ranking changes, such as e-commerce websites, who might see gains or losses in rankings because maybe the product review site that Google was previously ranking was affected by the update, so the e-commerce site wins out a little bit more. This has been a big change for a lot of sites in this category. We've seen a lot of ranking volatility with product review sites.

Core updates

The third type of update that we're all probably very familiar with and that has existed for a very long time are Google's core updates. These are nothing new, but the nature of them changes over time. So they pretty much happen quarterly. That's not like 100% true for every year, but it's pretty much every few months that Google will roll out a big core update. They've started to basically just name them after the month. So you might have the September 2022 core update, for example. What makes these tricky is that Google doesn't give us a lot of specificity each time they're rolled out about what changed.

They almost always reference back to the same article that says what site owners should know about Google core updates. That article gives 25 questions or so that the reader or the content creator should read with regard to what makes a good page, a good website. Does the website demonstrate E-A-T? Does the website have good quality content? These are all the questions that you should consider if you've been affected by core updates. Even if you haven't been affected, you should read them because it positions you well to do well when the next core update is rolled out.

Another concept that a lot of people don't understand about core updates is that they often operate on a sitewide level, similar to the helpful content update, which means if Google has determined a large-scale pattern of either great quality content or not good quality content, or perhaps a lack of E-A-T, expertise, authority, and trust in certain areas, a core update can impact the rankings of almost all your content at scale. So that's not necessarily to say that there's one individual article that dropped in rankings because that article is bad. You could actually just be impacted by the core update as a whole because Google decided that your site, in general, shouldn't be ranking as well as it is. So people don't always understand that core updates operate in a sitewide fashion.

Content quality is extremely important during core updates. So if you read Google's questions about the core update, almost all of them tie back to: How much is this website meeting the expectations of users? How much does the content offer something unique that I couldn't get from other people's websites? Is the spelling good? Is the grammar good? Is the usability good? All of this points back to quality.

Technical SEO is also part of content quality. If your website is easy for users and search engines to crawl through and to navigate without terrible page speeds or a bad user interface or things like that, this all factors into their quality evaluations. So it's not just content. It's also technical SEO. It's also performance, usability, website navigation. All these things factor into content quality.

Then intent is the last point I want to make because one thing that I've noticed with my core update analyses is that Google tends to be getting better at understanding user intent. That's not always to say somebody typed "I want to go to this store," like that's a pretty clear intent. When you type a keyword like "dogs," there's a lot of different intents that the user might be looking for. They might be looking to adopt a dog. They might be looking to feed a dog. They might be looking to take a dog on a walk. There are so many different things. Google has so much data that they understand the intent better behind every keyword.

When they launch a core update, you often see that the types of results that are ranking will change. So you might see a dictionary website start to rank during a core update. So let's say the example is dogs. After a core update happens, perhaps a dictionary takes the number one position. That's because Google determined most users want to define what the word "dog" means. If that happens, it's very hard to say that your site did something right or wrong. It's just that Google got better at understanding intent. So that's very important to understand with core updates. It doesn't always mean your site did anything wrong. It could just be that Google is getting smarter.

So with all of that, these updates will probably continue to happen going forward, so you should get a good understanding of how they work, and best of luck to you in your rankings.

Video transcription by Speechpad.com

Wednesday, November 23, 2022

The Leading Characteristics of Review Writers, Review Readers, and Successful Owner Responses

Common sense is a useful asset, and as it turns out, it’s a fairly reliable guide when it comes to navigating the big world of online local business reputation. However, for the very first time, thanks to the recent report, The Impact of Local Business Reviews on Consumer Behavior, I was able to test my intuition against original, hard data revealing the habits of real review readers, review writers, and successful owner responses.

I highly recommend reading the full survey analysis, but today, I want to distill that mass of data down into three simple descriptions that emerged through the considerable work of analysis. These three descriptions codify dominant traits, characteristics and behaviors. They are meant to help you envision both the public and practices in an approachable manner, with the proviso that some people and industries will certainly fall outside these norms. For the bulk of local businesses, however, it’s my hope that this synthesis enables you to form a useful mental picture of who and what you’re working with when it comes to growing and managing your reputation.

Leading characteristics of review writers, readers, and successful owner responses infographic.

Review readers are:

Habituated, very trusting unless faced with obvious signals of spam or low quality, much more trusting of other customers than of brands, still highly reliant on real world WOM recommendations, eager for a substantial amount of recent sentiment including negative sentiment, extremely forgiving when problems are resolved, and just one step away from interacting directly with your brand.

The data:

  • Review reading is now a given; 96% of the working age public will read reviews this year to navigate their local landscape. 56% of review readers are highly active daily or weekly readers. Even less active review readers (31%) will turn to reviews monthly or multiple times per year to get local business information.

  • Review readers spend the majority of their time reading Google-based reviews, but they cite at least a dozen other places where they regularly read reviews.

  • With 86% of consumers citing reviews as either the most important or somewhat important signal of whether a business can be trusted, reviews are the most influential sales copy review readers will encounter. In fact, only 11% of consumers say they trust what a business says about itself more than they trust what customers say. 83% of review readers trust reviews as much or more than they did 3 years ago.

  • When choosing between businesses, review readers evaluate the following elements in order of importance: star rating, text content, recency, overall number of reviews, and the presence of owner responses.

  • Review readers are not as demanding as you might think. Only 13% of review readers require a perfect 5-star rating in order to choose a business. In fact, 44% cite flawless ratings as suspicious. 85% will consider a business with an overall 3 to 4-star rating.

  • Review readers filter for recent and negative sentiment first.

  • Review readers want a substantial amount of reading material. 70% will look at 5-20 reviews before considering a business.

  • Review readers’ trust can be lost at a glance. When a local business reviews itself or has suspect profiles reviewing it, or when its star rating or review count is notably low compared to competitors’, trust is eroded and review readers may look elsewhere.

  • Reviews exist on platforms over which businesses have only partial control, but a review reader’s next step lands them back in the brand’s own court most of the time, with a combined 91% of readers ending up on the website, at the place of business, or contacting the business directly as their next step. In other words, reviews have added to, but not replaced, traditional shopping behaviors.

  • The tradition of your brand’s good name being on people’s lips also hasn’t changed. 67% of review readers cite the real-world recommendations of friends and family as being their top alternative resource to reading reviews.

Review writers are:

Civic-minded, appreciative, often self-motivated but more frequently in need of prompting, prone to forget to write when they are busy, highly likely to review you if asked via email, text, or face-to-face, active on multiple review platforms, deeply offended by rude service, bad products and incorrect online local business information, very willing to update what they’ve written and give a business a second chance when a complaint is resolved, and a key source of both sales and quality control.

The data:

  • Writing reviews is already a way of life for 41% of your customers who write reviews on a daily, weekly or monthly basis. An additional 44% who will write reviews several times a year may need to be asked, prompted and reminded.

  • 66% spend most of their time writing Google-based reviews, but review writers list at least a dozen other platforms where many spend time leaving sentiment.

  • Review writers say 65% of the negative reviews they write stem from bad/rude customer service. 63% cite a bad product, 52% cite false or incorrect online business info on assets like local business listings, 38% cite low-quality work on a job, 28% cite the failure of the business to resolve complaints in-person, and 28% cite inadequate safety protocols.

  • 73% of review writers are civic-minded, leaving sentiment to benefit their community, 63% write to express appreciation to local businesses, and 38% write to tell a local business that it needs to improve.

  • 39% of review writers haven’t been directly asked to write a review in the past 5 years. If asked, 85% will always, usually or at least sometimes write a review. Just 4% never write reviews in response to requests.

  • 54% of review writers like to be approached via email, 45% prefer person-to-person, and 29% prefer texting.

  • 38% of review writers simply forget to review your business when they have free time. 30% find the review writing process too confusing, 26% don’t believe the business will care enough to read what is written, and 19% are not being directly asked to write a review.

Successful owner responses should:

Happen within a two-hour to two-day time frame to please most reviewers, resolve stated complaints, avoid any type of acrimony, offer thanks for positive feedback and apologies for negative experiences, and be written with exceptional care because they influence 90% of customers to a moderate or extreme degree.

The data:

  • Owner responses influence 90% of customers to a moderate or extreme degree.

  • 60% of customers expect a response to their review within 2 days or less; 11% expect a response within 2 hours, 21% expect a response within 24 hours, and 28% expect a response within 48 hours; 24% say they expect a reply within a week.

  • 54% of customers will definitely avoid a business that is failing to provide a solution to a problem, 46% will definitely avoid a business with an owner who argues with customers in reviews, and 47% will definitely avoid the business when an owner response offers no apology.

  • Only 40% of customers have come to expect thanks for positive reviews. 64% of customers expect a response to negative reviews.

  • 67% of negative reviewers had an improved opinion of a brand when the owner responded well. 62% of negative reviewers would give a business a second chance after an owner response solves their problem. 63% of consumers will update their negative review or low-star rating once an owner response resolves their complaint.

In conclusion

Any local business which is founded on a customer-centric and employee-centric model already has a built-in advantage when it comes to managing the offline experiences that form the online brand narrative. Shoppers and staff simply want to be treated fairly and well. Local companies that meet these criteria in-store are capable of utilizing the same skills online, where digital sentiment has become like the front porch on a general store – a meeting, greeting, and helping spot for the community.

Local business owners and their marketers may need to invest in a few new tools to hang out on that porch effectively - think of them as the awning or wood stove you install to facilitate maximum comfort for everybody. But the skills that bring these tools to life are the ones the best local entrepreneurs already know - respect, attentiveness, accountability, empathy, responsiveness. Now we have the data to prove that the common sense approach of treating everyone well is actually very good business.

Hungry for more review data? Read: The Impact of Local Business Reviews on Consumer Behavior.

Monday, November 21, 2022

A Different Way of Thinking About Core Updates

These days, Google algorithm updates seem to come in two main flavors. There are very specific updates — like the Page Experience Update or Mobile-Friendly Update — which tend to be announced well in advance, provide very specific information on how the ranking factor will work, and finally arrive as a slight anti-climax. I’ve spoken before about the dynamic with these updates. They are obviously intended to manipulate the industry, and I think there is also a degree to which they are a bluff.

This post is not about those updates, though, it is about the other flavor. The other flavor of updates is the opposite: they are announced when they are already happening or have happened, they come with incredibly vague and repetitive guidance, and can often have cataclysmic impact for affected sites.

Coreschach tests

Since March 2018, Google has taken to calling these sudden, vague cataclysms “Core Updates”, and the type really gained notoriety with the advent of “Medic” (an industry nickname, not an official Google label), in August 2018. The advice from Google and the industry alike has evolved gradually over time in response to changing Quality Rater guidelines, varying from the exceptionally banal (“make good content”) to the specific but clutching at straws (“have a great about-us page”). To be clear, none of this is bad advice, but compared to the likes of the Page Experience update, or even the likes of Panda and Penguin, it demonstrates an extremely woolly industry picture of what these updates actually promote or penalize. To a degree, I suspect Core Updates and the accompanying era of “EAT” (Expertise, Authoritativeness, and Trust) have become a bit of a Rorschach test. How does Google measure these things, after all? Links? Knowledge graphs? Subjective page quality? All the above? Whatever you want to see?

If I am being somewhat facetious there, it is born out of frustration. As I say, (almost) none of the speculation, or the advice it results in, is actually bad. Yes, you should have good content written by genuinely expert authors. Yes, SEOs should care about links. Yes, you should aim to leave searchers satisfied. But if these trite vagaries are what it takes to win in Core Updates, why do sites that do all these things better than anyone, lose as often as they win? Why does almost no site win every time? Why does one update often seem to undo another?

Roller coaster rides

This is not just how I feel about it as a disgruntled SEO — this is what the data shows. Looking at sites affected by Core Updates since and including Medic in MozCast, the vast majority have mixed results.

Meanwhile, some of the most authoritative original content publishing sites in the world actually have a pretty rocky ride through Core Updates.

I should caveat: this is in the MozCast corpus only, not the general performance of Reuters. But still, these are real rankings, and each bar represents a Core Update where they have gone up or down. (Mostly, down.) They are not the only ones enjoying a bumpy ride, either.

The reality is that pictures like this are very common, and it’s not just spammy medical products like you might expect. So why is it that almost all sites, whether they be authoritative or not, sometimes win, and sometimes lose?

The return of the refresh

SEOs don’t talk about data refreshes anymore. This term was last part of the regular SEO vocabulary in perhaps 2012.

This was the idea that major ranking fluctuation was sometimes caused by algorithm updates, but sometimes simply by data being refreshed within the existing algorithm — particularly if this data was too costly or complex to update in real time. I would guess most SEOs today assume that all ranking data is updated in real time.

But, have a look at this quote from Google’s own guidance on Core Updates:

“Content that was impacted by one might not recover—assuming improvements have been made—until the next broad core update is released.”

Sounds a bit like a data refresh, doesn’t it? And this has some interesting implications for the ranking fluctuations we see around a Core Update.

If your search competitor makes a bunch of improvements to their site, then when a Core Update comes round, under this model, you will suddenly drop. This is no indictment of your own site, it’s just that SEO is often a zero sum game, and suddenly a bunch of improvements to other sites are being recognized at once. And if they go up, someone must come down.

This kind of explanation sits easily with the observed reality of tremendously authoritative sites suffering random fluctuation.

Test & learn

The other missing piece of this puzzle is that Google acknowledges its updates as tests:

This sounds, at face value, like it is incompatible with the refresh model implied by the quote in the previous section. But, not necessarily — the tests and updates referred to could in fact be happening between Core Updates. Then the update itself simply refreshes the data and takes in these algorithmic changes at the same time. Or, both kinds of update could happen at once. Either way, it adds to a picture where you shouldn’t expect your rankings to improve during a Core Update just because your website is authoritative, or more authoritative than it was before. It’s not you, it’s them.

What does this mean for you?

The biggest implication of thinking about Core Updates as refreshes is that you should, essentially, not care about immediate before/after analysis. There is a strong chance that you will revert to mean between updates. Indeed, many sites that lose in updates nonetheless grow overall.

The below chart is the one from earlier in this post, showing the impact of each Core Update on the visibility of www.reuters.com (again — only among MozCast corpus keywords, not representative of their total traffic). Except, this chart also has a line showing how the total visibility nonetheless grew despite these negative shocks. In other words, they more than recovered from each shock, between shocks.

Under a refresh model, this is somewhat to be expected. Whatever short term learning the algorithm does is rewarding this site, but the refreshes push it back to an underlying algorithm, which is less generous. (Some would say that that short term learning could be driven by user behavior data, but that’s another argument!)

The other notable implication is that you cannot necessarily judge the impact of an SEO change or tweak in the short term. Indeed, causal analysis in this world is incredibly difficult. If your traffic goes up before a Core Update, will you keep that gain after the update? If it goes up, or even just holds steady, through the update, which change caused that? Presumably you made many, and equally relevantly, so did your competitors.

Experience

Does this understanding of Core Updates resonate with your experience? It is, after all, only a theory. Hit us up on Twitter, we’d love to hear your thoughts!

Friday, November 18, 2022

Advanced On-Page SEO Optimizations — Whiteboard Friday

Typically, when SEOs think about on-page optimizations, they’re thinking about core places to include their target keywords within their content. But how can you take your on-page optimizations to the next level and get beyond some of those basic tactics? In today’s episode of Whiteboard Friday, Chris Long shows you how.

whiteboard outlining advanced on-page SEO optimizations

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I'm Chris Long with Go Fish Digital, and today we're going to talk about advanced on-page optimizations. Commonly in SEO, when we think about on-page optimizations, we're typically thinking about core places to include the keywords, such as the title, the H1, the URL within the content. But some people might be wondering, how can you take your on-page optimizations to the next level and get beyond some of those basic tactics? So that's what I want to cover today.

Entities

So one of the best ways I found to shift away from the keyword mindset is actually to shift to more of an entity mindset. So, for example, if you're going to optimize a page for the term "retire early," instead of using the term "retire early" a bunch of times on the page, you could use tools like IBM Watson or Google Natural Language. Both of those have public-facing tools that you can run any text document through. And if you ran a result like Investopedia through them, you might see that "retire early" is associated with entities such as Vicki Robin and Joe Dominguez, two of the top authors on retiring early.
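If you'd rather script this kind of entity check than paste text into the public demo pages, a minimal sketch using Google's Cloud Natural Language client for Node.js could look like the following. This assumes you already have a Google Cloud project and credentials configured, and the sample text is invented for illustration:

    // Minimal sketch: list the entities Google Natural Language finds in a block of page copy.
    // Assumes Google Cloud credentials are configured in the environment.
    const language = require('@google-cloud/language');

    async function listEntities(text) {
      const client = new language.LanguageServiceClient();

      const [result] = await client.analyzeEntities({
        document: { content: text, type: 'PLAIN_TEXT' },
      });

      // Salience indicates how central an entity is to the analyzed text.
      for (const entity of result.entities) {
        console.log(`${entity.name} (${entity.type}) - salience ${entity.salience}`);
      }
    }

    // Hypothetical snippet of "retire early" page copy.
    listEntities('Vicki Robin and Joe Dominguez wrote Your Money or Your Life, a classic book on retiring early.');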

As well, when you take a look at competitor pages and have more of an entity mindset in mind, instead of just thinking about how many times they're using a keyword on the page, you're thinking more strategically about common topics and themes you want to integrate within your own website content.

E-A-T

Another great way to take your on-page optimization to the next level is this concept of E-A-T — expertise, authoritativeness, and trustworthiness. One of the best ways to improve the expertise of your site content is just to simply look at your author biographies. A lot of sites still get this wrong. And when you look at author biographies, you should always be thinking about, hey, where can I highlight my years of experience, my education, my previous experience, my thought leadership directly within your author biographies to better highlight your expertise to both Google and users.

As well, another thing I love to think about with on-page optimizations is this concept of information gain scores. It's one of my favorite patents, analyzed by Bill Slawski, where he talks about the fact that Google looks to reward content that adds to the search results and doesn't just repeat what's already out there. So think about where you can leverage your own unique expertise, data, and insights to benefit from this concept of information gain scores.

Another great way to improve the E-A-T of your site's content is to actually cite sources. The Wirecutter is phenomenal at doing this. Any time they cite an individual fact, they actually cite where they got that fact and link to external, trusted, accredited sources to verify where they're finding that information from. Another great way to improve the trustworthiness of your content and take your on-page optimizations to the next level.

Freshness

Another strategy that I think is highly, highly underrated is this concept of freshness. We've actually run tests on our own site, and we pretty consistently see that when we do things like update timestamps or just refresh content, we see noticeable upticks in rankings, visibility, and traffic. And I think this makes sense from multiple perspectives when you really start to think about it. From a trustworthiness standpoint, if Google thinks the content is outdated, well, it's hard for it to trust that the information within the article is actually accurate. As well, from a competitive standpoint, it's very hard for Google to compete in terms of real-time results. That's why users might go to platforms like Twitter instead of Google. However, in recent years, Google has been making a push toward including live blog-posting-type URLs in Top Stories. I think they're trying to incentivize publishers to update their content in real time to set the expectation that users can get real-time information on Google instead of just Twitter.

Historical competitor changes

Another great way of thinking about your on-page optimizations is this concept of historical competitor changes. Oftentimes, when we think about our on-page optimizations, we're only thinking about what competitors are doing in the given moment, but we're not telling the story of how they've changed their on-page optimizations in order to get to that point. So you can do this type of analysis for really competitive queries. What I like to do is find a strong competitor that's actually improved in the rankings in recent years, take that page, run it back through the Wayback Machine, and see which on-page changes have been made over time: what content they're adding, what they're removing, and what they're keeping the same. That can help you better isolate the most prominent on-page changes competitors have made.

Another great strategy to use is to use a text diff compare tool. You can actually take an old version of text and then compare that against the current version of text, run that through a tool, and the tool will actually highlight all of the changes competitors are making. That makes it very easy for you to find what on-page strategies your competitors are utilizing.
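If you don't have a diff tool handy, even a rough script gets you most of the way there. The sketch below is a naive, hypothetical example in Node.js that only flags added and removed lines between an archived copy and the live copy; real diff tools are far more thorough:

    // Naive sketch of a text comparison between an archived and a current version of page copy.
    function simpleDiff(oldText, newText) {
      const oldLines = new Set(oldText.split('\n'));
      const newLines = new Set(newText.split('\n'));

      const removed = [...oldLines].filter((line) => !newLines.has(line));
      const added = [...newLines].filter((line) => !oldLines.has(line));

      return { added, removed };
    }

    // Hypothetical before/after copy, e.g. pulled from the Wayback Machine and the live page.
    const archived = 'Retire early with our savings plan.\nContact us today.';
    const current = 'Retire early with our savings plan.\nRead our early retirement case studies.\nContact us today.';

    console.log(simpleDiff(archived, current));
    // => { added: [ 'Read our early retirement case studies.' ], removed: [] }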

Keyword segmentation

The final aspect of advanced on-page optimizations I want to talk about is this concept of keyword segmentation. We segment our traffic data in Google Analytics all the time, but we don't segment our keyword data in the same way. So using tools like STAT, we can actually create keyword segments any time we do some type of on-page optimization. If we update entities, if we update freshness, if we update EAT, we can create keyword segments in all of those different instances. And then, over time, we can compare the segments against each other and measure what the most important ones have been. That will actually give you better data about what type of on-page optimizations work best for your specific sites.

So, hopefully, that's been helpful. Hopefully, you'll walk out of here with some more strategies and concrete takeaways. Now you can improve your on-page optimizations and take them to the next level. Thanks a lot Moz fans.

Video transcription by Speechpad.com

Wednesday, November 16, 2022

Announcing: The Impact of Local Business Reviews on Consumer Behavior | SEO Industry Report

A warm welcome to Moz’s first large-scale survey on the habits of local business review readers, review writers and successful owner responses. Our survey uncovered interesting insights and actions local businesses can take to better serve their customers, earn more reviews, and build relationships. Read our free report today to peruse the findings, our thoughtful analysis, and expert commentary from local SEO industry professionals.

Read the report!

We surveyed 1,000+ US residents to assess trending behaviors surrounding reviews and responses and gained a powerful picture of the role this type of online sentiment is playing in consumer journeys, conversions, and most importantly, reputation. Local business owners and their marketers can access the full report today for strategic takeaways like these:

Only 11% of consumers trust brand messaging over public sentiment

In the complete report, you’ll learn that 96% of consumers now read online local business reviews. That’s almost everyone accessing this type of content, and the context comes into high relief when you know that just 11% of the public trusts what brands say about themselves over what customers say. Review management deserves serious investment from local businesses because it is the customers who are now writing the most trusted brand narratives.

52% of respondents say their negative reviews stem from false or incorrect online information on assets like local business listings

Local business listing management also needs to sit at the core of your marketing strategy because, without it, negative customer experiences in the real world result in negative online reviews. When neglected listings lead to incorrect contact information around the web, customers are significantly inconvenienced: they drive to wrong locations, call outdated phone numbers, or arrive at the premises outside of actual business hours.

With over half of customers having written negative reviews following poor experiences like these, it’s clear that location data management is essential to customer service and is key to protecting your reputation.

91% of consumers’ next steps after reading reviews occur in areas completely controlled by the business

Local search can be a powerful connector between brands and consumers, but it can also sometimes lead to feelings of a loss of control. While business owners and marketers can be part of the conversation in big spaces like review platforms and social media, they can’t directly control it.

This is why it should come as such welcome news that the incredibly broad road of review readers lands the majority of customers right back into areas directly controlled by the business. As the next step after reading reviews, 51% of consumers visit your website, 27% go directly to your place of business, and 13% contact you. It turns out that you have significant control over customer experiences along the post-review-reading customer journey.

The top reason customers don’t review your business is because they forget to

As you dig deep into Moz’s complete survey findings, you will come to identify a leading consumer desire for a substantial number of recent reviews. It’s this trend that obliges local business owners to implement review acquisition campaigns so that fresh sentiment is always incoming.

It’s a welcome insight to know that 38% of customers don’t leave you a review because they simply forget to when they have free time. This is the top reason, among many, why you likely aren’t receiving as many reviews as you need. Fortunately, a remedy is within easy reach: share follow-up reminders to review your business with customers via email, text, and print assets. You can get more reviews if you just keep communicating.

62% of negative reviewers would give a local brand a second chance after an owner response solves their problem

As you move through the complete report, you’ll come to see the medium of reviews as a platform for two-way conversations, with the majority of customers who leave a negative review expecting to hear back quickly from the business owner. It's hard to imagine better tidings than this: 62% of your customers are willing to give your company a second chance if your owner response successfully resolves their complaints.

This figure transforms scary narratives surrounding negative reviews into moments within a relationship where forgiveness is likely to follow when real help is given. A complete local search marketing campaign must include ongoing hands-on responsiveness to online customer sentiment.

Come get the keys to running a customer-centric local business

As we’ve learned, reviews are a wide road almost all of your potential and current customers are traveling on. To fully charge your vehicle for its best performance on that highway, local business review stats and trends can help you:

  • Better serve customers by understanding their needs

  • Implement structural fixes within your business based on problems cited by consumers

  • Earn more reviews to improve your local pack rankings and conversions

  • Build loyal community relationships via two-way conversations

Reading The Impact of Local Business Reviews on Consumer Behavior will help you prioritize reputation management tasks on the basis of consumer demand and habits. It will give you access to expert commentary from industry leaders including Aaron Weiche, Amy Toman, Crystal Carter, Joy Hawkins, and Mike Blumenthal. And, it will be a resource you can share with multiple stakeholders, be they clients, staff, team members, or company leadership to get buy-in for the considerable work involved in professionally managing reviews. There’s nothing quite like good data to make a great point, so please come take this ride with us!

Read: The Impact of Local Business Reviews on Consumer Behavior | SEO Industry Report

segunda-feira, 14 de novembro de 2022

How to Optimize for Google's Featured Snippets [Updated for 2022]

Google’s featured snippets started as an experiment almost a decade ago.

They have since become an integral part of Google’s SERPs, showing up for lots of queries.

In fact, featured snippets are now considered organic position #1, so optimizing for them is part of any SEO strategy.

What are featured snippets?

Featured snippets are selected search results that are featured in a box at the top of Google's organic results, below the ads.

Featured snippets aim at answering the user's question right away (hence their other well-known name, "answer boxes").

Recent studies reveal that featured snippets have an average 35% click-through rate.

Being featured means being on top of everything (except for ads), in the most prominent spot:

Types of featured snippets

There are three major types of featured snippets:

  • Paragraph (an answer is given in text)

  • List (an answer is given in the form of a list)

  • Table (an answer is given in a table)

Each type can also include an image, and that image may come from a third-party page that is not featured. There may be 2 images included inside the featured box:

An older study from STAT found that the most popular featured snippet is the "paragraph" type.

Featured snippets or answer boxes?

The terminology may still be pretty loose here. Many people (including myself) are inclined to refer to featured snippets as "answer boxes," obviously because there's an answer presented in a box.

While there's nothing wrong with this terminology, it creates a certain confusion because Google often gives a "quick answer" (a definition, an estimate, etc.) on top without linking to the source:

To avoid confusion, let's stick to the "featured snippet" term whenever there's a URL featured in the box, because these provide extra exposure for the linked site (hence they're important for content publishers):

Do I have a chance to get featured?

Yes.

According to another older study by Ahrefs, nearly all featured pages already rank in the top 10 of Google. So if you are already ranking in the top 10 for related search queries, you have a very good chance of getting featured.

Featured snippets appear and disappear for the same queries, but you have a higher chance of getting featured if there’s already a featured snippet showing up for your target query (i.e., Google has already identified the search intent for your query as informational).

Obviously, given the purpose of this SERP feature (i.e., to give a quick answer), you have a higher chance of getting featured if you answer a lot of questions in your content.

Identify all kinds of opportunities to be featured

Start with good old keyword research

Multiple studies confirm that the majority of featured snippets are triggered by long-tail keywords. In fact, the more words that are typed into a search box, the higher the probability there will be a featured snippet.

It's always a good idea to start with researching your keywords. Moz’s Keyword Explorer is a good place to start.

When performing keyword research with featured snippets in mind, note the following:

  • Start with question-type search queries (those containing question words, like “what,” “why,” “how,” etc.) because these are the easiest to identify, but don’t stop there...

  • Target informational intent, not just questions. While featured snippets aim at answering the user’s question immediately, question-type queries are not the only types that trigger those featured results. According to the aforementioned Ahrefs study, the vast majority of keywords that trigger featured snippets were long-tail queries with no question words in them.

It helps if you use a keyword research tool that shows immediately whether a query triggers featured results. SE Ranking offers a nice filter allowing you to see keywords that are currently triggering featured snippets:

You can also run your competitor's domain through Serpstat and then filter their best-performing queries by the presence of featured snippets.

This is a great overview of your future competition, enabling you to see your competitors' strengths and weaknesses.

Browse Google for more questions

To further explore the topic, be sure to research popular niche questions.

Tools like Buzzsumo and Text Optimizer can give you a good insight into questions people tend to ask around your topic:

Identify search queries where you already rank high

Your lowest-hanging fruit is to identify which phrases you already rank highly for. These will be the easiest to get featured for after you optimize for answer boxes (more on this below).

Google Search Console shows which search queries send you clicks. To find that report:

  • Click "Performance".

  • Check the box to show the position your pages hold for each query, and you'll be able to see which queries are your top performers:

Note that Search Console labels featured snippet positions as #1 (SEOs used to call them "position 0"). So when you see #1 in Google Search Console, there’s nothing to do there. Focus on #2 and lower.

You can then use the filters to find some question-type queries among those:
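If you'd rather pull this data programmatically, the Search Console API exposes the same query-level positions. A rough sketch using the Python client library; the property URL, date range, and question-word list are assumptions for illustration:

```python
# A rough sketch: pull query-level data from the Search Console API and flag
# question-type queries ranking just below the featured spot. Assumes you've
# already set up OAuth credentials; the property URL and dates are placeholders.
from googleapiclient.discovery import build

QUESTION_WORDS = ("how", "what", "why", "when", "which", "who", "can", "does", "is")

def question_opportunities(credentials, site_url="https://example.com/"):
    service = build("searchconsole", "v1", credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": "2022-08-01",
            "endDate": "2022-10-31",
            "dimensions": ["query"],
            "rowLimit": 5000,
        },
    ).execute()
    for row in response.get("rows", []):
        query, position = row["keys"][0], row["position"]
        # Position 1 may already be a featured snippet; focus on 2 and lower.
        if 2 <= position <= 10 and query.split()[0] in QUESTION_WORDS:
            print(f"{query!r} -- average position {position:.1f}")
```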

Go beyond traditional keyword research tools: Ask people

All the above methods (albeit great) tackle already discovered opportunities: those for which you or your competitors are already ranking high. But how about venturing beyond that? Ask your readers, customers, and followers how they search and which questions they ask.

MyBlogU: Ask people outside your immediate reach

Move away from your target audience and ask random people what questions they have on a specific topic and what their concerns would be. Looking outside the box can always give a fresh perspective.

MyBlogU (disclaimer: I am the founder) is a great way to do that. Just post a new project in the "Brainstorm" section and ask members to contribute their thoughts.

Seed Keywords: Ask your friends and followers

Seed Keywords is a simple tool that allows you to discover related keywords with help from your friends and followers. Simply create a search scenario, share it on social media, and ask your followers to type in the keywords they would use to solve it.

Try not to be too leading with your search scenario. Avoid guiding people to the search phrase you think they should be using.

Here's an example of a scenario:

And here are the suggestions from real people:

Obviously, you can also create similar surveys with tools like WP Forms or Google Forms.

Organize questions and keywords

I use spreadsheets to organize questions and keyword phrases I discover (see more on this below). Some of these questions may become a whole piece of content, while others will be subsections of broader articles:

  • I don’t try to analyze search volume to decide whether any of those questions deserve to be covered in a separate article or a subsection. (Based on the Ahrefs research and my own observations, there is no direct correlation between the popularity of the term and whether it will trigger a featured snippet).

  • Instead, I use my best judgment (based on my niche knowledge and research) as to how much I will be able to say in answer to each particular question. If it’s a lot, I’ll probably turn it into a separate article and use keyword research to identify subsections of the future piece.

Optimizing for featured snippets

Start with on-page SEO

There is no magic button or special markup which will make sure your site gets featured. Of course, it's a good idea to start with non-specific SEO best practices, simply because being featured is only possible when you rank high for the query.

Randy Milanovic did a good overview of tactics of making your content findable. Eric Brantner over at Coschedule has put together a very useful SEO checklist, and of course never forget to go through Moz’s SEO guide.

That being said, the best way to get featured is to provide a better answer. Here are a few actionable tips:

1. Aim at answering each question concisely

My own observation of answer boxes has led me to think that Google prefers to feature an answer which was given within one paragraph.

An older study by AJ Ghergich cites that the average length of a paragraph snippet is 45 words (the maximum is 97 words), so let it be your guideline as to how long each answer should be in order to get featured.

This doesn't mean your articles need to be one paragraph long. On the contrary, these days Google seems to give preference to long-form content (also known as "cornerstone content," which is obviously a better way to describe it because it's not just about length) that's broken into logical subsections and features attention-grabbing images.

Even if you don’t believe that cornerstone content receives any special treatment in SERPs, focusing on long articles will help you to cover more related questions within one piece (more on that below).

All you need to do is to adjust your blogging style just a bit:

  • Ask the question in your article (that may be a subheading)

  • Immediately follow the question with a one-paragraph answer

  • Elaborate further in the article

This tactic may also result in higher user retention because it makes any article better structured and thus a much easier read. To quote AJ Ghergich,

When you use data to fuel topic ideation, content creation becomes more about resources and less about brainstorming.
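If you want to audit existing articles against this question-then-short-answer pattern, a small script can flag question subheadings that aren't immediately followed by a concise paragraph. A rough sketch, assuming BeautifulSoup and a locally saved copy of the post (the ~50-word threshold simply mirrors the paragraph-length guideline above):

```python
# A rough sketch that checks whether each question-style subheading is followed
# by a short answer paragraph. post.html is a placeholder for a saved article.
from bs4 import BeautifulSoup

with open("post.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

for heading in soup.find_all(["h2", "h3"]):
    title = heading.get_text(strip=True)
    if "?" not in title:
        continue  # only audit question-style subheadings
    answer = heading.find_next_sibling("p")
    if answer is None:
        print(f"No paragraph directly after: {title}")
        continue
    words = len(answer.get_text().split())
    verdict = "OK" if words <= 50 else "consider tightening"
    print(f"{title} -- {words}-word answer [{verdict}]")
```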

2. Be factual and organize well

Google loves numbers, steps and lists. We've seen this again and again: More often than not, answer boxes will list the actual ingredients, number of steps, time to cook, year and city of birth, etc.

Use Google’s guide on writing meta descriptions to get a good idea of what kinds of summaries and answers they are looking for when generating snippets (including featured snippets).

Google loves well-structured, factual, and number-driven content.

There's no specific markup to structure your content. Google seems to pick up <table>, <ol>, and <ul> well and doesn't need any other pointers. Using H2 and H3 subheadings will make your content easier to understand for both Google and your readers.

3. Make sure one article answers many related questions

Google is very good at determining synonymic and closely related questions, and so should you be. There's no point in creating a separate page answering each specific question.

Creating one solid article addressing many related questions is a much smarter strategy if you aim at getting featured in answer boxes. This leads us to the next tactic:

4. Organize your questions properly

To combine many closely related questions in one article, you need to organize your queries properly. This will also help you structure your content well.

I have a multi-level keyword organization strategy that can be applied here as well:

  • A generic keyword makes a section or a category of the blog

  • A more specific search query becomes the title of the article

  • Even more specific queries determine the subheadings of the article and thus define its structure
    • There will be multiple queries that are so closely related that they will all go under a single subheading

For example:

Serpstat helps me a lot when it comes to both discovering an article idea and then breaking it into subtopics. Check out its "Questions" section. It will provide hundreds of questions containing your core term and then generate a tag cloud of other popular terms that come up in those questions:

Clicking any word in the tag cloud will filter results down to those questions that only have that word in them. These are subsections for your article.
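You can approximate the same tag-cloud grouping yourself from any exported list of questions by counting the terms they share and filing each question under its most common one. A rough sketch; the sample questions and stopword list are purely illustrative:

```python
# A rough sketch of grouping question keywords by their most common shared term,
# to suggest subsections for one article. Questions and stopwords are examples only.
from collections import Counter, defaultdict

questions = [
    "how to optimize for featured snippets",
    "how to add images to featured snippets",
    "what image size do featured snippets use",
    "do featured snippets need schema markup",
    "does schema markup help featured snippets",
]
stopwords = {"how", "to", "for", "what", "do", "does", "need", "use", "help",
             "featured", "snippet", "snippets", "add"}

# Count the remaining "modifier" terms across all questions.
terms = Counter(w for q in questions for w in q.split() if w not in stopwords)

groups = defaultdict(list)
for question in questions:
    for term, _count in terms.most_common():
        if term in question.split():
            groups[term].append(question)
            break  # file each question under its most popular modifier

for term, grouped in groups.items():
    print(term, "->", grouped)
```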

Here’s another good guide on identifying your keyword modifiers (groups) and using those to structure your content.

Here's a good example of how related questions can help you structure the article:

5. Make sure to use eye-grabbing images

Paragraph featured snippets with images are ridiculously eye-catching, even more so than regular featured snippets. Honestly, I wasn't able to identify how to add an image so that it's featured. I tried naming it differently and I tried marking it as "featured" in the WordPress editor. Google seems to pick up a random image from the page without me being able to point it to a better version.

That being said, the only way to influence that is to make sure ALL your in-article images are eye-catching, branded, and annotated well, so that no matter which one Google ends up featuring, it will look nice. 

Optimizing and branding your images well is crucial for featured snippet optimization because images are often included in featured boxes, and in many cases those images come from different domains.

Clicking an image within a featured snippet enlarges that image, inviting the user to go to the linked site. In other words, this can be a traffic-building opportunity for non-featured sites.

Google is pulling these images from Google Images search results, so image optimization is important for driving traffic from featured snippets.

Also, don’t forget to update and re-upload the images (on WordPress). WordPress adds dates to image URLs, so even if you update an article with newer information, the images can be considered old.
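If you want to spot-check a post for this, a few lines of code can flag images whose upload path looks dated (WordPress puts the upload year and month in the media URL by default). A rough sketch, with post.html standing in for a saved copy of your article:

```python
# A rough sketch that flags <img> tags whose WordPress upload path is more than
# a year old. post.html and the cutoff year are placeholders.
import re
from bs4 import BeautifulSoup

CURRENT_YEAR = 2022

with open("post.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

for img in soup.find_all("img"):
    src = img.get("src", "")
    match = re.search(r"/uploads/(\d{4})/(\d{2})/", src)
    if match and int(match.group(1)) < CURRENT_YEAR - 1:
        print(f"Possibly stale image (uploaded {match.group(1)}): {src}")
```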

Monitor your progress

You are already monitoring your organic positions, and featured snippets are tracked as position #1 these days.

For your most important keywords, you may want to set up closer monitoring to be alerted when Google changes anything:

How about structured markup?

Many people would suggest using Schema.org (simply because it's been a "thing" to recommend adding schema for anything and everything), but the aforementioned Ahrefs study shows that there's no correlation between featured results and structured markup.

Conclusion

It takes a lot of research and planning, and you cannot be sure when you'll see the results (especially if you don't have many top 10 rankings just yet), but think about it this way: being featured in Google search results is your incentive to work harder on your content. You'll achieve other important goals on your way there:

  • You'll discover hundreds of new content ideas (and thus will rank for a wider variety of long-tail keywords)

  • You'll learn to research each topic more thoroughly (and thus will build more incoming links because people tend to link to in-depth articles)

  • You'll learn to structure your articles better (and thus achieve a lower bounce rate because it will be easier to read your articles)

Update: We have released a featured snippet optimization tool. With it, you can see exactly what your featured snippet opportunities are and what it may take to grab each of them (based on where the featured page ranks organically, where your page ranks, and what type of featured snippet to optimize for).