Friday, February 28, 2020

The Rules of Link Building - Best of Whiteboard Friday

Posted by BritneyMuller

Are you building links the right way? Or are you still subscribing to outdated practices? Britney Muller clarifies which link building tactics still matter and which are a waste of time (or downright harmful) in one of our very favorite classic episodes of Whiteboard Friday.

The Rules of Link Building

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Happy Friday, Moz fans! Welcome to another edition of Whiteboard Friday. Today we are going over the rules of link building. It's no secret that links are one of the top three ranking factors in Google and can greatly benefit your website. But there is a little confusion around what's okay to do as far as links and what's not. So hopefully, this helps clear some of that up.

The Dos

All right. So what are the dos? What do you want to be doing? First and most importantly is just to...

I. Determine the value of that link. So aside from ranking potential, what kind of value will that link bring to your site? Is it potential traffic? Is it relevancy? Is it authority? Just start to weigh out your options and determine what's really of value for your site. Our own tool, Moz Link Explorer, can help you evaluate a prospective link's metrics as you weigh those options.

II. Local listings still do very well. These local business citations are on a bunch of different platforms, and services like Moz Local or Yext can get you up and running a little bit quicker. They tend to show Google that this business is indeed located where it says it is. It has consistent business information — the name, address, phone number, you name it. But something that isn't really talked about all that often is that some of these local listings never get indexed by Google. If you think about it, Yellowpages.com is probably populating thousands of new listings a day. Why would Google want to index all of those?

So if you're doing business listings, an age-old thing that local SEOs have been doing for a while is create a page on your site that says where you can find us online. Link to those local listings to help Google get that indexed, and it sort of has this boomerang-like effect on your site. So hope that helps. If that's confusing, I can clarify down below. Just wanted to include it because I think it's important.

III. Unlinked brand mentions. One of the easiest ways you can get a link is by figuring out who is mentioning your brand or your company and not linking to it. Let's say this article publishes about how awesome SEO companies are and they mention Moz, and they don't link to us. That's an easy way to reach out and say, "Hey, would you mind adding a link? It would be really helpful."
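As a rough illustration of how you might triage unlinked mentions at scale, here's a minimal Python sketch. It assumes you already have candidate URLs from a mention-monitoring tool or a search query; the example URL, brand name, and domain are placeholders, not anything from this post.

```python
# Minimal sketch: flag pages that mention a brand but don't link to its domain.
# Assumes candidate_urls came from a mention-monitoring tool (hypothetical list).
import requests
from bs4 import BeautifulSoup

BRAND = "Moz"
BRAND_DOMAIN = "moz.com"
candidate_urls = [
    "https://example.com/best-seo-companies",  # hypothetical page that mentions the brand
]

def is_unlinked_mention(url: str) -> bool:
    """Return True if the page mentions the brand but never links to its domain."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    mentions_brand = BRAND.lower() in soup.get_text().lower()
    links_to_brand = any(BRAND_DOMAIN in (a.get("href") or "") for a in soup.find_all("a"))
    return mentions_brand and not links_to_brand

outreach_targets = [u for u in candidate_urls if is_unlinked_mention(u)]
print(outreach_targets)
```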

IV. Reclaiming broken links is also a really great way to get back some of your links in a short amount of time and with little to no effort. What does this mean? It means another site is linking to a page on your site that now 404s. They were sending people to a specific page that you've since deleted or moved somewhere else. Whatever the case, you want to make sure that you 301 that broken URL so that it pushes the authority to a live page. Definitely a great thing to do anyway.
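If you want to find those broken, backlinked URLs programmatically, here's a minimal sketch. It assumes you've exported a list of your URLs that have external backlinks (for example, from a backlink tool); the URLs shown are hypothetical.

```python
# Minimal sketch: find backlinked URLs on your own site that now return 404,
# so you can 301-redirect them to the closest live page.
# Assumes target_urls was exported from a backlink tool (hypothetical list).
import requests

target_urls = [
    "https://www.example.com/old-product-page",
    "https://www.example.com/blog/retired-post",
]

broken = []
for url in target_urls:
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code == 404:
            broken.append(url)
    except requests.RequestException:
        broken.append(url)  # unreachable pages are worth reviewing too

print("Candidates for a 301 redirect:", broken)
```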

V. HARO (Help a Reporter Out). Reporters will notify you of any questions or information they're seeking for an article via this email service. So not only is it just good general PR, but it's a great opportunity for you to get a link. I like to think of link building as really good PR anyway. It's like digital PR. So this just takes it to the next level.

VI. Just be awesome. Be cool. Sponsor awesome things. I guarantee any one of you watching likely has incredible local charities or amazing nonprofits in your space that could use the sponsorship, however big or small that might be. But that also gives you an opportunity to get a link. So something to definitely consider.

VII. Ask/Outreach. There's nothing wrong with asking. There's nothing wrong with outreach, especially when done well. I know that link building outreach in general kind of gets a bad rap because the response rate is so painfully low. I think, on average, it's around 4% to 7%, which is painful. But you can get that higher if you're a little bit more strategic about it or if you outreach to people you already currently know. There's a ton of resources available to help you do this better, so definitely check those out. We can link to some of those below.

VIII. COBC (create original badass content). We hear lots of people talk about this. When it comes to link building, it's like, "Link building is dead. Just create great content and people will naturally link to you. It's brilliant." It is brilliant, but I also think that there is something to be said about having a healthy mix. There's this idea of link building and then link earning. But there's a really perfect sweet spot in the middle where you really do get the most bang for your buck.

The Don'ts

All right. So what not to do. The don'ts of today's link building world are...

I. Don't ask for specific anchor text. All of these things appear so spammy. The late Eric Ward talked about this and was a big advocate for never asking for anchor text. He said websites should be linked to however they see fit. That's going to look more natural. Google is going to consider it to be more organic, and it will help your site in the long run. So that's more of a suggestion. These other ones are definitely big no-no's.

II. Don't buy or sell links that pass PageRank. You can buy or sell links that have a nofollow attached, which signals that the link is paid for, whether it's an advertisement or something you don't vouch for. So definitely look into that attribute and understand how it works.

III. Hidden links. We used to do this back in the day, the ridiculous white link on a white background. They were totally hidden, but crawlers would pick them up. Don't do that. That's so old and will not work anymore. Google is getting so much smarter at understanding these things.

IV. Low-quality directory links. Same with low-quality directory links. We remember those where it was just loads and loads of links and text and a random auto insurance link in there. You want to steer clear of those.

V. Site-wide links also look very spammy. Site-wide being whether it's a footer link or a top-level navigation link, you definitely don't want to go after those. They can appear really, really spammy. Avoid those.

VI. Comment links with over-optimized anchor text are another thing you specifically want to avoid. Again, it's just like any of these others: it looks spammy, and it's not going to help you long term. Again, what's the value of that overall? So avoid that.

VII. Abusing guest posts. You definitely don't want to do this. You don't want to guest post purely for a link. However, I am still a huge advocate, as I know many others out there are, of guest posting and providing value. Whether there's a link or not, I think there is still a ton of value in guest posting. So don't get rid of it altogether, but definitely don't treat it purely as a link building tactic.

VIII. Automated tools used to create links on all sorts of websites. ScrapeBox is an infamous one that would create the comment links on all sorts of blogs. You don't want to do that.

IX. Link schemes, private link networks, and private blog networks. This is where you really get into trouble as well. Google will penalize or de-index you altogether. It looks so, so spammy, and you want to avoid this.

X. Link exchanges. Back in the day, you would submit your website to a link exchange, and they wouldn't grant you that link until you also linked back to them. Super silly. This stuff does not work anymore, but there are tons of opportunities and quick wins for you to gain links naturally and more authoritatively.

So hopefully, this helps clear up some of the confusion. One question I would love to ask all of you is: To disavow or to not disavow? I have heard back-and-forth conversations on either side on this. Does the disavow file still work? Does it not? What are your thoughts? Please let me know down below in the comments.

Thank you so much for tuning in to this edition of Whiteboard Friday. I will see you all soon. Thanks.

Video transcription by Speechpad.com



Wednesday, February 26, 2020

How Low Can #1 Go? (2020 Edition)

Posted by Dr-Pete

Being #1 on Google isn't what it used to be. Back in 2013, we analyzed 10,000 searches and found out that the average #1 ranking began at 375 pixels (px) down the page. The worst case scenario, a search for "Disney stock," pushed #1 all the way down to 976px.

A lot has changed in seven years, including an explosion of rich SERP (Search Engine Results Page) features, like Featured Snippets, local packs, and video carousels. It feels like the plight of #1 is only getting worse. So, we decided to run the numbers again (over the same searches) and see if the data matches our perceptions. Is the #1 listing on Google being pushed even farther down the page?

I try to let the numbers speak for themselves, but before we dig into a lot of stats, here's one that legitimately shocked me. In 2020, over 1,600 (16.6%) of the searches we analyzed had #1 positions that were worse than the worst-case scenario in 2013. Let's dig into a few of these ...

What's the worst-case for #1?

Data is great, but sometimes it takes the visuals to really understand what's going on. Here's our big "winner" for 2020, a search for "lollipop" — the #1 ranking came in at an incredible 2,938px down. I've annotated the #1 position, along with the 1,000px and 2,000px marks ...

At 2,938px, the 2020 winner comes in at just over three times 2013's worst-case scenario. You may have noticed that the line is slightly above the organic link. For the sake of consistency and to be able to replicate the data later, we chose to use the HTML/CSS container position. This hits about halfway between the organic link and the URL breadcrumbs (which recently moved above the link). This is a slightly more conservative measure than our 2013 study.
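For anyone who wants to replicate this kind of measurement, here's a minimal Python/Selenium sketch of the general approach: render the SERP at a fixed viewport and read the Y-offset of a result's container. The CSS selector is a placeholder assumption, not the one used in this study, and Google's markup changes often.

```python
# Minimal sketch: measure how far down the page a result container starts,
# using the element's rendered position (roughly the HTML/CSS container
# position described above). The selector below is a placeholder assumption.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.set_window_size(1366, 768)  # fixed viewport for comparable measurements
driver.get("https://www.google.com/search?q=lollipop")

first_organic = driver.find_element(By.CSS_SELECTOR, "div.g")  # placeholder selector
y_position = driver.execute_script(
    "return arguments[0].getBoundingClientRect().top + window.pageYOffset;",
    first_organic,
)
print(f"#1 container starts at {y_position:.0f}px")
driver.quit()
```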

You may also have noticed that this result contains a large-format video result, which really dominates page-one real estate. In fact, five of our top 10 lowest #1 results in 2020 contained large-format videos. Here's the top contender without a large-format video, coming in at fourth place overall (a search for "vacuum cleaners") ...

Before the traditional #1 organic position, we have shopping results, a research carousel, a local pack, People Also Ask results, and a top products carousel with a massive vertical footprint. This is a relentlessly commercial result. While only a portion of it is direct advertising, most of the focus of the page above the organic results is on people looking to buy a vacuum.

What about the big picture?

It's easy — and more than a little entertaining — to cherry-pick the worst-case scenarios, so let's look at the data across all 10,000 results. In 2013, we only looked at the #1 position, but we've expanded our analysis in 2020 to consider all page-one organic positions. Here's the breakdown ...

The only direct comparison to 2013 is the position #1 row, and you can see that every metric increased, some substantially. If you look at the maximum Y-position by rank, you'll notice that it peaks around #7 and then begins to decrease. This is easier to illustrate in a chart ...

To understand this phenomenon, you have to realize that certain SERP features, like Top Stories and video carousels, take the place of a page-one organic result. At the same time, those features tend to be longer (vertically) than a typical organic result. So, a page with 10 traditional organic results will in many cases be shorter than a page with multiple rich SERP features.

What's the worst-case overall?

Let's dig into that seven-result page-one bucket and look at the worst-case organic position across all of the SERPs in the study, a #7 organic ranking coming in at 4,487px ...

Congratulations, you're finally done scrolling. This SERP has seven traditional organic positions (including one with FAQ links), plus an incredible seven rich features and a full seven ads (three are below the final result). Note that this page shows the older ad and organic design, which Google is still testing, so the position is measured as just above the link.

How much do ads matter?

Since our 2013 study, Google has removed right-hand column ads on desktop (in early 2016) and increased the maximum number of top ads from three to four. One notable point about ads is that they have prime placement above both organic results and SERP features. So, how does this impact organic Y-positions? Here's a breakdown ...

Not surprisingly, the mean and median increase as ad-count increases – on average, the more ads there are, the lower the #1 organic position is. So why does the maximum Y-position of #1 decrease with ad-count? This is because SERP features are tied closely to search intent, and results with more ads tend to be more commercial. This naturally rules out other features.

For example, while 1,270 SERPs on February 12 in our 10,000-SERP data set had four ads on top, and 1,584 had featured snippets, only 16 had both (just 1% of SERPs with featured snippets). Featured snippets naturally reflect informational intent (in other words, they provide answers), whereas the presence of four ads signals strong commercial intent.
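To make the overlap concrete, here's a minimal sketch of that calculation using the counts reported above. The DataFrame simply reconstructs those totals with assumed column names, so treat it as an illustration rather than the study's actual pipeline.

```python
# Minimal sketch of the overlap calculation above, using the reported counts.
# With a per-SERP table of boolean feature flags (column names assumed),
# the same figures fall out of simple boolean sums.
import pandas as pd

# serps = pd.read_csv("serp_features.csv")  # hypothetical export, one row per SERP
serps = pd.DataFrame({
    "four_ads_top": [True] * 1270 + [False] * 8730,
    "featured_snippet": [True] * 16 + [False] * 1254 + [True] * 1568 + [False] * 7162,
})

both = (serps["four_ads_top"] & serps["featured_snippet"]).sum()
snippets = serps["featured_snippet"].sum()
print(f"{both} SERPs have both ({both / snippets:.1%} of SERPs with featured snippets)")
```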

Here's the worst-case #1 position for a SERP with four ads on top in our data set ...

The college results are a fairly rare feature, and local packs often appear on commercial results (as anyone who wants to buy something is looking for a place to buy it). Even with four ads, though, this result comes in significantly higher than our overall worst-case #1 position. While ads certainly push down organic results, they also tend to preclude other rich SERP features.

What about featured snippets?

In early 2014, a year after our original study, Google launched featured snippets, promoted results that combine organic links with answers extracted from featured pages. For example, Google can tell you that I am both a human who works for Moz and a Dr. Pepper knock-off available at Target ...

While featured snippets are technically considered organic, they can impact click-through rates (CTR) and the extracted text naturally pushes down the organic link. On the other hand, Featured Snippets tend to appear above other rich SERP features (except for ads, of course). So, what's the worst-case scenario for a #1 result inside a featured snippet in our data set?

Ads are still pushing this result down, and the bullet list extracted from the page takes up a fair amount of space, but the absence of other SERP features above the featured snippet puts this in a much better position than our overall worst-case scenario. This is an interesting example, as the "According to mashable.com ..." text is linked to Mashable (but not considered the #1 result), but the images are all linked to more Google searches.

Overall in our study, the average Y-position of #1 results with featured snippets was 99px lower/worse (704px) than traditional #1 results (605px), suggesting a net disadvantage in most cases. In some cases, multiple SERP features can appear between the featured snippet and the #2 organic result. Here's an example where the #1 and #2 result are 1,342px apart ...

In cases like this, it's a strategic advantage to work for the featured snippet, as there's likely a substantial drop-off in clicks from #1 to #2. Featured snippets are going to continue to evolve, and examples like this show how critical it is to understand the entire landscape of your search results.

When is #2 not worth it?

Another interesting case that's evolved quite a bit since 2013 is brand searches, or as Google is more likely to call them, "dominant intent" searches. Here's a SERP for the company Mattress Firm ...

While the #1 result has solid placement, the #2 result is pushed all the way down to 2,848px. Note that the #1 position has a search box plus six full site-links below it, taking up a massive amount of real estate. Even the brand's ad has site-links. Below #1 is a local pack, People Also Ask results, Twitter results from the brand's account, heavily branded image results, and then a product refinement carousel (which leads to more Google searches).

There are only five total, traditional organic results on this page, and they're made up of the company's website, the company's Facebook page, the company's YouTube channel, a Wikipedia page about the company, and a news article about the company's 2018 bankruptcy filing.

This isn't just about vertical position — unless you're Mattress Firm, trying to compete on this search really doesn't make much sense. They essentially own page one, and this is a situation we're seeing more and more frequently for searches with clear dominant intent (i.e. most searchers are looking for a specific entity).

What's a search marketer to do?

Search is changing, and change can certainly be scary. There's no question that the SERP of 2020 is very different in some ways than the SERP of 2013, and traditional organic results are just one piece of a much larger picture. Realistically, as search marketers, we have to adapt — either that, or find a new career. I hear alpaca farming is nice.

I think there are three critical things to remember. First, the lion's share of search traffic still comes from traditional organic results. Second, many rich features are really the evolution of vertical results, like news, videos, and images, that still have an organic component. In other words, these are results that we can potentially create content for and rank in, even if they're not the ten blue links we traditionally think of as organic search.

Finally, it's important to realize that many SERP features are driven by searcher intent and we need to target intent more strategically. Take the branded example above — it may be depressing that the #2 organic result is pushed down so far, but ask yourself a simple question. What's the value of ranking for "mattress firm" if you're not Mattress Firm? Even if you're a direct competitor, you're flying in the face of searchers with a very clear brand intent. Your effort is better spent on product searches, consumer questions, and other searches likely to support your own brand and sales.

If you're the 11th person in line at the grocery checkout and the line next to you has no people, do you stand around complaining about how person #2, #7, and #9 aren't as deserving of groceries as you are? No, you change lines. If you're being pushed too far down the results, maybe it's time to seek out different results where your goals and searcher goals are better aligned.

Brief notes on methodology

Not to get too deep in the weeds, but a couple of notes on our methodology. These results were based on a fixed set of 10,000 keywords that we track daily as part of the MozCast research project. All of the data in this study is based on page-one, Google.com, US, desktop results. While the keywords in this data set are distributed across a wide range of topics and industries, the set skews toward more competitive "head" terms. All of the data and images in this post were captured on February 12, 2020. Ironically, this blog post is over 26,000 pixels long. If you're still reading, thank you, and may God have mercy on your soul.



Tuesday, February 25, 2020

Are H1 Tags Necessary for Ranking? [SEO Experiment]

Posted by Cyrus-Shepard

In the earlier days of search marketing, SEOs often heard the same two best practices repeated so many times that they became implanted in our brains:

  1. Wrap the title of your page in H1 tags
  2. Use one — and only one — H1 tag per page

These suggestions appeared in audits and SEO tools, and were the source of constant head shaking. Conversations would go like this:

"Silly CNN. The headline on that page is an H2. That's not right!"
"Sure, but is it hurting them?"
"No idea, actually."

Over time, SEOs started to abandon these ideas, and the strict concept of using a single H1 was replaced by "large text near the top of the page."

Google grew better at content analysis and at understanding how the pieces of a page fit together. Given how often publishers make mistakes with HTML markup, it makes sense that Google would try to figure it out for itself.

The question comes up so often that Google's John Mueller addressed it in a Webmaster Hangout:

"You can use H1 tags as often as you want on a page. There's no limit — neither upper nor lower bound.
H1 elements are a great way to give more structure to a page so that users and search engines can understand which parts of a page are kind of under different headings, so I would use them in the proper way on a page.
And especially with HTML5, having multiple H1 elements on a page is completely normal and kind of expected. So it's not something that you need to worry about. And some SEO tools flag this as an issue and say like 'oh you don't have any H1 tag' or 'you have two H1 tags.' From our point of view, that's not a critical issue. From a usability point of view, maybe it makes sense to improve that. So, it's not that I would completely ignore those suggestions, but I wouldn't see it as a critical issue.
Your site can do perfectly fine with no H1 tags or with five H1 tags."

Despite these assertions from one of Google's most trusted authorities, many SEOs remained skeptical, wanting to "trust but verify" instead.

So of course, we decided to test it... with science!

Craig Bradford of Distilled noticed that the Moz Blog — this very one — used H2s for headlines instead of H1s (a quirk of our CMS).

[Image: the blog's headline markup, showing the post title wrapped in an H2 rather than an H1 for the SEO test experiment]

We devised a 50/50 split test of our titles using the newly branded SearchPilot (formerly DistilledODN). Half of our blog titles would be changed to H1s, and half kept as H2. We would then measure any difference in organic traffic between the two groups.

After eight weeks, the results were in:

To the uninitiated, these charts can be a little hard to decipher. Rida Abidi of Distilled broke down the data for us like this:

Change breakdown - inconclusive
  • Predicted uplift: 6.2% (est. 6,200 monthly organic sessions)
  • We are 95% confident that the monthly increase in organic sessions is between:
    • Top: 13,800
    • Bottom: -4,100
The results of this test were inconclusive in terms of organic traffic, therefore we recommend rolling it back.

Result: Changing our H2s to H1s made no statistically significant difference

Confirming their statements, Google's algorithms didn't seem to care if we used H1s or H2s for our titles. Presumably, we'd see the same result if we used H3s, H4s, or no heading tags at all.
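The "inconclusive" call follows directly from the reported confidence interval: it spans zero, so the true effect could plausibly be negative, zero, or positive. A minimal sketch of that check, using the reported bounds:

```python
# Minimal sketch: a split-test result is inconclusive when the confidence
# interval for the uplift includes zero. Numbers below are the reported bounds.
ci_low, ci_high = -4_100, 13_800   # 95% CI for monthly organic sessions gained
point_estimate = 6_200             # predicted uplift

if ci_low < 0 < ci_high:
    verdict = "inconclusive: the interval includes zero, so no significant effect"
else:
    verdict = "significant: the interval excludes zero"

print(f"Estimated uplift {point_estimate:+,} sessions/month ({ci_low:,} to {ci_high:,}): {verdict}")
```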

It should be noted that our titles still:

  • Used a large font
  • Sat at the top of each article
  • Were unambiguous and likely easy for Google to figure out

Does this settle the debate? Should SEOs throw caution to the wind and throw away all those H1 recommendations?

No, not completely...

Why you should still use H1s

Despite the fact that Google seems to be able to figure out the vast majority of titles one way or another, there are several good reasons to keep using H1s as an SEO best practice.

Georgy Nguyen made some excellent points in an article over at Search Engine Land, which I'll try to summarize and add to here.

1. H1s help accessibility

Screen reading technology can use H1s to help users navigate your content, both when reading through a page and when searching within it.

2. Google may use H1s in place of title tags

In some rare instances — such as when Google can't find or process your title tag — they may choose to extract a title from some other element of your page. Oftentimes, this can be an H1.

3. Heading use is correlated with higher rankings

Nearly every SEO correlation study we've ever seen has shown a small but positive correlation between higher rankings and the use of headings on a page, such as this most recent one from SEMrush, which looked at H2s and H3s.

To be clear, there's no evidence that headings in and of themselves are a Google ranking factor. But headings, like Structured Data, can provide context and meaning to a page.

As John Mueller said on Twitter:

What's it all mean? While it's a good idea to keep adhering to H1 "best practices" for a number of reasons, Google will more than likely figure things out — as our experiment showed — if you fail to follow strict H1 guidelines.

Regardless, you should likely:

  1. Organize your content with hierarchical headings — ideally H1, H2s, H3s, etc.
  2. Use a large font headline at the top of your content. In other words, make it easy for Google, screen readers, and other machines or people reading your content to figure out the headline.
  3. If you have a CMS or technical limitations that prevent you from using strict H1s and SEO best practices, do your best and don't sweat the small stuff.

Real-world SEO — for better or worse — can be messy. Fortunately, it can also be flexible.



Monday, February 24, 2020

Spot Zero is Gone — Here's What We Know After 30 Days

Posted by PJ_Howland

As you are probably aware by now, recent updates have changed the world of search optimization. On January 22nd, Google, in its infinite wisdom, decided that a URL that earns the featured snippet in a SERP would no longer get an additional organic listing in that SERP. This also means that, from now on, the featured snippet is the true spot-one position.

Rather than rehash what’s been so eloquently discussed already, I’ll direct you to Dr. Pete’s post if you need a refresher on what this means for you and for Moz.

30 days is enough to call out trends, not all of the answers

I’ve been in SEO long enough to know that when there’s a massive shake-up (like the removal of spot zero), bosses and clients want to know what it means for the business. In situations like this, SEOs’ responses are limited to 1) what they can see in their own accounts, and 2) what others are reporting online.

A single 30-day period isn’t enough time to observe concrete trends and provide definitive suggestions for what every SEO should do. But it is enough time to give voice to the breakout trends that are worth observing as time goes on. The only way for SEOs to come out on top is by sharing the trends they are seeing with each other. Without each other’s data and theories, we’ll all be left to see only what’s right in front of us — which is often not the entire picture.

So in an effort to further the discussion on the post-spot-zero world, we at 97th Floor set out to uncover the trends under our nose, by looking at nearly 3,000 before-and-after examples of featured snippets since January 22nd.

The data and methodology

I know we all want to just see the insights (which you’re welcome to skip to anyway), but it's worth spending a minute explaining the loose methodology that yielded the findings.

The two major tools used here were Google Search Console and STAT. While Google Analytics has more traffic data than GSC, it can't tie that traffic back to the actual keywords that drove it; it's limited to page-level numbers. For this reason, we used GSC to get the click-through rates of specific keywords on specific pages. This pairs nicely with STAT's data to give us a daily pinpoint of both Google Rank and Google Base Rank for the keywords at hand.

While there are loads of keywords to look at, we found that small-volume keywords — anything under 5,000 global MSV (with some minor exceptions) — produced findings that didn’t have enough data behind them to claim statistical significance. So, all of the keywords analyzed had over 5,000 global monthly searches, as reported by STAT.

It’s also important to note that all the difficulty scores come from Moz.

Obviously, we were only interested in SERPs that already had a featured snippet serving, to ensure we had an accurate before-and-after picture, which narrows down the number of keywords again. When all was said and done, the final batch of keywords analyzed was 2,773.

We applied basic formulas to determine which keywords were telling clear stories. That led us to intimately analyze about 100 keywords by hand, sometimes spending multiple hours looking at a single keyword, or rather a single SERP, over a 30-day period. The findings reported below come from these 100 qualitative keyword analyses.
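For anyone who wants to run a similar before-and-after comparison, here's a minimal pandas sketch of the general shape of the analysis. The file names and column names are assumptions about what a GSC and STAT export might look like, not 97th Floor's actual pipeline.

```python
# Minimal sketch of the before/after comparison described above.
# Assumes two hypothetical exports: GSC performance rows (keyword, page, date,
# clicks, impressions) and STAT rank rows (keyword, date, google_base_rank).
import pandas as pd

CUTOFF = pd.Timestamp("2020-01-22")

gsc = pd.read_csv("gsc_keyword_page.csv", parse_dates=["date"])
stat = pd.read_csv("stat_ranks.csv", parse_dates=["date"])

df = gsc.merge(stat, on=["keyword", "date"], how="inner")
df = df[df["impressions"] >= 1]           # guard against divide-by-zero rows
df["period"] = df["date"].apply(lambda d: "before" if d < CUTOFF else "after")

summary = (
    df.groupby(["keyword", "period"])
      .agg(clicks=("clicks", "sum"),
           impressions=("impressions", "sum"),
           avg_rank=("google_base_rank", "mean"))
)
summary["ctr"] = summary["clicks"] / summary["impressions"]
print(summary.head())
```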

Oh, and this may go without saying, but I’m doing my best to protect 97th Floor’s clients’ data, so I won’t be giving anything incriminating away as to which websites my screenshots are attached to. 97th Floor has access to hundreds of client GSC accounts, and we track keywords in STAT for nearly every one of them.

Put plainly, I’m dedicated to sharing the best data and insight, but not at the expense of our clients’ privacy.

The findings... not what I expected

Yes, I was among the SEOs who said that, for the first time ever, we might actually need to consider shooting for spot 2 instead of spot 1.

I still don’t think I was wrong (as the data below shows), but after this data analysis I’ve come to find that it’s a more nuanced story than the quick and dirty results we all want from a study like this.

The best way to unravel the mystery of the spot-zero demotion is to call out the individual findings from this study as individual lessons learned. So, in no particular order, here are the findings.

Longtime snippet winners are seeing CTR and traffic drops

While the post-spot-zero world may seem exciting for SEOs that have been gunning for a high-volume snippet spot for years, the websites that have held powerful snippet positions indefinitely are seeing fewer clicks.

The keyword below represents a page we built years ago for a client that has held the snippet almost exclusively since launch. The keyword has a global search volume of 74,000 and a difficulty of 58, not to mention an average CPC of $38.25. Suffice it to say that this is quite a lucrative keyword and position for our client.

We parsed out the CTR of this single keyword directing to this single page in Google Search Console for the two weeks prior to the January 22nd announcement and the two weeks following it. I'd love to go back farther than two weeks, but if we did, we would have crept into New Year's traffic numbers, which would have muddled the data.

As you can see, the impressions and average position remained nearly identical for these two periods. But CTR and subsequent clicks decreased dramatically in the two weeks immediately following the January 22nd spot-zero termination.

If this trend continues for the rest of 2020, this single keyword's snippet changeup will result in a drop of 9,880 clicks over the year. Again, that's just a single keyword, not all of the keywords this page represents. When you incorporate the average CPC into the equation, that amounts to $377,910 in lost click value (if those were paid clicks).
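The dollar figure is just the extrapolated click loss multiplied by the keyword's average CPC; the 9,880-click extrapolation itself comes from the two-week delta, which isn't reproduced here. A quick check of the arithmetic:

```python
# Quick check of the arithmetic above: extrapolated annual click loss times CPC.
projected_lost_clicks = 9_880     # reported extrapolation for 2020
avg_cpc = 38.25                   # reported average CPC for the keyword

print(f"Equivalent paid-click value: ${projected_lost_clicks * avg_cpc:,.2f}")
# -> Equivalent paid-click value: $377,910.00
```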

Sure, this is an exaggerated situation due to the volume of the keyword and inflated CPC, but the principle uncovered over and over in this research remains the same: Brands that have held the featured snippet position for long periods of time are seeing lower CTRs and traffic as a direct result of the spot-zero shakeup.

When a double snippet is present, CTR on the first snippet tanks

Nearly as elusive as the yeti or Bigfoot, the double snippet found in its natural habitat is rare.

Sure this might be expected; when there are two results that are both featured snippets, the first one gets fewer clicks. But the raw numbers left us with our jaws on the floor. In every instance we encountered this phenomenon we discovered that spot one (the #1 featured snippet) loses more than 50% of its CTR when the second snippet is introduced.

This 40,500 global MSV keyword was the sole featured snippet controller on Monday, and on Tuesday the SERP remained untouched (aside from the second snippet being introduced).

This small change brought our client’s CTR to its knees from a respectable 9.2% to a crippling 2.9%.

When you look at how this keyword performed the rest of the week, the trend continues to follow suit.

Monday and Wednesday are single snippet days, while Tuesday, Thursday, and Friday brought the double snippet.

Easy come, easy go (not a true Spot 1)

There’s been a great deal of speculation on this fact, but now I can confirm that ranking for a featured snippet doesn’t come the same way as ranking for a true spot 1. In the case below, you can see a client of ours dancing around spots 5 and 6 before taking a snippet. Similarly when they lose the snippet, they fall back to the original position.

Situations like this were all too common. Most of the time we see URLs losing the snippet to other URLs. Other times, Google removes the snippet entirely only to bring it back the following day.

If you’re wondering what the CTR reporting on GSC was for the above screenshot, I’ve attached that below. But don’t geek out too quickly; the findings aren’t terribly insightful. Which is insightful in itself.

This keyword has 22,200 global volume and a keyword difficulty of 44. The SERP gets significant traffic, so you would think that findings would be more obvious.

If there’s something to take away from situations like this, here it is: Earning the snippet doesn’t inherently mean CTRs will improve beyond what you would be getting in a below-the-fold position.

Seeing CTR bumps below the fold

Much of the data addressed to this point speaks of sites that either held featured snippets or lost them, but what about the sites that haven't had a snippet before or after this shakeup?

If that describes your situation, you can throw yourself a tiny celebration (emphasis on the tiny), because the data is suggesting that your URLs could be getting a slight CTR bump.

The example below shows a 74,000 global MSV keyword that has hovered between spots 5 and 7 for the week preceding and the week following January 22nd.

The screenshot from STAT shows that this keyword has clearly remained below the fold and fairly consistent. If anything, it ranked worse after January 22nd.

The click-through rate improved the week following January 22nd from 3% to 3.7%. Perhaps not enough to warrant any celebration for those that are below the fold, as this small increase was typical across many mid-first-page positions.

“People Also Ask” boxes are here to steal your snippet CTR

Perhaps this information isn’t new when considering the fact that PAA boxes are just one more place that can lead users down a rabbit hole of information that isn’t about your URL.

On virtually every single SERP (in fact, we didn’t find an instance where this wasn’t true), the presence of a PAA box drops the CTR of both the snippet and the standard results.

The negative effects of the PAA box appearing in your SERP are mitigated when the PAA box doesn’t serve immediately below the featured snippet. It’s rare, but there are situations where the “People Also Ask” box serves lower in the SERP, like this example below.

If your takeaway here is to create more pages that answer questions showing up in relevant PAA boxes, take a moment to digest the fact that we rarely saw instances of clicks when our clients showed up in PAA boxes.

In this case, we have a client that ranks for two out of the first four answers in a high-volume SERP (22,000 global monthly searches), but didn’t see a single click — at least none to speak of from GSC:

While its counterpart page, which served in spot 6 consistently, is at least getting some kind of click-through rate:

If there’s a lesson to be learned here, it’s that ranking below the fold on page one is better than getting into the PAA box (in terms of clicks, anyway).

So, what is the takeaway?

As you can tell, the findings are a bit all over the place. However, the main takeaway that I keep coming back to is this: Clickability matters more than it ever has.

As I was crunching this data, I was constantly reminded of a phrase our EVP of Operations, Paxton Gray, is famous for saying:

“Know your SERPs.”

This stands truer today than it did in 2014 when I first heard him say it.

As I reflected on this pool of frustrating data, I was reminded of Jeff Bezos’s remarks in his 2017 Amazon shareholder letter:

“One thing I love about customers is that they are divinely discontent. Their expectations are never static — they go up. It’s human nature. We didn’t ascend from our hunter-gatherer days by being satisfied. People have a voracious appetite for a better way, and yesterday’s ‘wow’ quickly becomes today’s ‘ordinary’.”

And then it hit me: Google wasn’t built for SEOs; it’s built for users. Google’s job is our job, giving the users the best content. At 97th Floor our credo is: we make the internet a better place. Sounds a little corny, but we stand by it. Every page we build, every ad we run, every interactive we build, and every PDF we publish for our clients needs to make the internet a better place. And while it’s challenging for us watching Google’s updates take clicks from our clients, we recognize that it’s for the user. This is just one more step in the elegant dance we perform with Google.

I remember a day when spots 1, 2, and 3 were consistently getting CTRs in the double digits. And today, we celebrate if we can get spot 1 over 10% CTR. Heck, I‘ll even take an 8% for a featured snippet after running this research!

SEO today is more than just putting your keyword in a title and pushing some links to a page. SERP features can have a more direct effect on your clicks than your own page optimizations. But that doesn’t mean SEO is out of our control — not by a long shot. SEOs will pull through, we always do, but we need to share our learnings with each other. Transparency makes the internet a better place after all.



Friday, February 21, 2020

Which of My Competitor's Keywords Should (& Shouldn't) I Target? - Best of Whiteboard Friday

Posted by randfish

You don't want to try to rank for every one of your competitors' keywords. Like most things with SEO, it's important to be strategic and intentional with your decisions. In this fan favorite Whiteboard Friday, Rand shares his recommended process for understanding your funnel, identifying the right competitors to track, and prioritizing which of their keywords you ought to target.

Plus, don't miss our upcoming webinar on Wednesday, March 11: Competitive Analysis for SEO: Size Up & Surpass Your Search Rivals presented by Director of Growth Marketing Kelly Cooper.

Which of my competitor's keywords should I target?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. So this week we're chatting about your competitors' keywords and which of those competitive keywords you might want to actually target versus not.

Many folks use tools like SEMrush, Ahrefs, KeywordSpy, Spyfu, and Moz's Keyword Explorer, which now has this feature too, to look at: What are the keywords that my competitors rank for that I may be interested in? This is actually a pretty smart way to do keyword research. Not the only way, but a smart way to do it. But the challenge comes in when you start looking at your competitors' keywords and then realizing which of these you should actually go after and in what priority order. In the world of competitive keywords, the process actually differs a bit from classic keyword research.

So here I've plugged in Hammer and Heels, which is a small online furniture store that has some cool designer furniture; Dania Furniture, which is a competitor of theirs — they're local in the Seattle area, but carry sort of modern, Scandinavian furniture; and IndustrialHome.com, which is in a similar space. So all three of these are in a similar space, and you can see the sort of keywords that come back, keywords that one or more of these sites rank for. I've put together difficulty, volume, and organic click-through rate, which are some of the metrics that you'll find. You'll find these metrics in most of the tools that I just mentioned.

Process:

So when I'm looking at this list, which ones do I want to actually go after and not, and how do I choose? Well, this is the process I would recommend.

I. Try and make sure you first understand your keyword to conversion funnel.

So if you've got a classic sort of funnel, you have people buying down here — this is a purchase — and you have people who search for particular keywords up here. If you understand which people you lose and which people actually make it through the buying process, that's going to be very helpful in knowing which of these terms and phrases, and which types of terms and phrases, to actually go after. In general, when you're prioritizing competitive keywords, you probably don't want to go after keywords that send traffic but don't turn into conversions, unless that's actually your goal. If your goal is raw traffic only, maybe because you serve advertising, or because you know that you can capture a lot of folks very well through retargeting, then that's a different story. For example, maybe Hammer and Heels says, "Hey, we want the biggest traffic funnel we can get, because we know that with our retargeting campaigns, even if a keyword brings us someone who doesn't convert, we can convert them later very successfully." Fine. Go ahead.

II. Choose competitors that tend to target the same audience(s).

So the competitors you plug in here should be ones that target the same audiences as you. Otherwise, your relevance and your conversion get really hard. For example, I could have used West Elm, which does generally modern furniture as well, but they're very, very broad. They target just about everyone. I could have done Ethan Allen, which is sort of a very classic, old-school furniture maker. Probably a really different audience than these three websites. I could have done IKEA, which is sort of a low-market brand for everybody. Again, not quite the match. So if you're targeting conversions, assuming these folks are going after mostly conversion-focused or retargeting-focused traffic rather than raw traffic, my suggestion would be strongly to go after sites with the same audience as you.

If you're having trouble figuring out who those people are, one suggestion is to check out a tool called SimilarWeb. It's expensive, but very powerful. You can plug in a domain and see what other domains people are likely to visit in that same space and what has audience overlap.

III. The keyword selection process should follow some of these rules:

A. Are easiest first.

So I would go after the ones that I think I'm most likely to be able to rank for easily. Why do I recommend that? Because it's tough in SEO, with a lot of campaigns, to get budget and buy-in unless you can show progress early. So any time you can choose the easiest ones first, you're going to be more successful. That's low difficulty, high odds of success, high odds that you actually have the team needed to make the content necessary to rank. I wouldn't go after competitive brands here.

B. Are similar to keywords you target that convert well now.

So if you understand this funnel well, you can use your AdWords campaign particularly well for this. So you look at your paid keywords and which ones send you highly converting traffic, boom. If you see that lighting is really successful for our furniture brand, "Oh, well look, glass globe chandelier, that's got some nice volume. Let's go after that because lighting already works for us."

Of course, you want ones that fit your existing site structure. So if you say, "Oh, we're going to have to make a blog for this, oh we need a news section, oh we need a different type of UI or UX experience before we can successfully target the content for this keyword," I'd push that down a little further.

C. High volume, low difficulty, high organic click-through rate, or SERP features you can reach.

So basically, when you look at difficulty, that's telling you how hard it is for me to rank for this potential keyword. If I look in here and I see some 50s and 60s, but I actually see a good number in the 30s and 40s, I would think that glass globe chandelier, S-shaped couch, industrial home furniture, these are pretty approachable. That's impressive stuff.

Volume, I want as high as I can get, but oftentimes high volume leads to very high difficulty.
Organic click-through rate percentage is essentially saying what percent of people click on the 10-blue-links-style organic search results. Classic SEO will help get me there. However, if you see low numbers, like a 55% for this type of chair, you might take a look at those search results and see that a lot of images are taking up the other organic click-throughs, and you might say, "Hey, let's go after image SEO as well." So it's not just organic click-through rate. You can also target SERP features.
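As a rough way to operationalize the volume/difficulty/CTR balance Rand describes, here's a minimal sketch that scores a keyword list. The weighting and the sample numbers are purely illustrative (this isn't a Moz formula), and the column names assume a typical keyword-tool export.

```python
# Minimal sketch: rank competitor keywords by an ad-hoc priority score that
# rewards volume and organic CTR and penalizes difficulty. The weighting and
# the sample numbers are illustrative only; column names assume a typical export.
import pandas as pd

keywords = pd.DataFrame([
    # keyword, monthly volume, difficulty (0-100), organic CTR (0-1)
    ("glass globe chandelier", 2900, 38, 0.78),
    ("s-shaped couch", 800, 33, 0.81),
    ("industrial home furniture", 1300, 41, 0.62),
    ("mid century modern sofa", 9900, 62, 0.55),
], columns=["keyword", "volume", "difficulty", "organic_ctr"])

keywords["priority"] = (
    keywords["volume"] * keywords["organic_ctr"] / (keywords["difficulty"] + 1)
)
print(keywords.sort_values("priority", ascending=False))
```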

D. Are brands you carry/serve, generally not competitor's brand names.

Then last, but not least, I would urge you to go after brands when you carry and serve them, but not when you don't. So if this Ekornes chair is something that your furniture store, that Hammer and Heels, actually carries, great. But if it's something that's exclusive to Dania, I wouldn't go after it. I would generally not go after competitors' brand names or branded product names, with one exception, and I actually used this site to highlight it. Industrial Home Furniture is both a branded term, because it's the name of this website — Industrial Home Furniture is their brand — and it's also a generic term. So in those cases, I would tell you, yes, it probably makes sense to go after a category like that.

If you follow these rules, you can generally use competitive intel on keywords to build up a really nice portfolio of targetable, high potential keywords that can bring you some serious SEO returns.

Look forward to your comments and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


For more on competitor analysis, join our upcoming webinar on Wednesday, March 11 at 10am PST: Competitive Analysis for SEO: Size Up & Surpass Your Search Rivals, hosted by Moz's Director of Growth Marketing Kelly Cooper.




Monday, February 17, 2020

Google My Business: FAQ for Multiple Businesses at the Same Address

Posted by MiriamEllis

How should I get listed in Google My Business if I’ve got multiple businesses at the same address? How many listings am I eligible for if I’m legitimately running more than one business at my location? What determines eligibility, and what penalties might I incur if I make a mistake? How should I name my businesses at the same address?

The FAQs surrounding this single, big topic fill local SEO forums across the web, year after year.

The guidelines for representing your business on Google contain most of the answers you’re seeking about co-located businesses, but sometimes they can err on the side of too little detail, leading to confusion.

Today, let's quickly tackle the commonest FAQs that local business owners and marketers raise related to this scenario, and if you have further questions, please ask in the comments!

Q: I have more than one business at the same address. Can I have more than one Google My Business listing?

A: If you are legitimately operating multiple, legally distinct businesses, you can typically create a Google My Business listing for each of them. It’s not at all uncommon for more than one business to be located at a shared address. However, keep reading for further details and provisos.

Q: How do I know if my multiple businesses at the same address are distinct enough to be eligible for separate Google My Business listings?

A: If each brick-and-mortar business you operate is separately registered with the appropriate state and federal agencies, has a unique tax ID with which you file separate taxes, meets face-to-face with customers, and has a unique phone number, then it’s typically eligible for a distinct GMB listing. However, keep reading for more information.

Q: Can service area businesses list multiple businesses at the same address?

A: Google has historically treated SABs differently than brick-and-mortar businesses. While no official guideline forbids listing multiple SABs — like plumbers and locksmiths — at the same location, it’s not considered an industry best practice to do so. Google appears to be more active in issuing hard suspensions to SABs in this scenario, even if the businesses are legitimate and distinct. Because of this, it’s better strategy not to co-locate SABs.

Q: What would make me ineligible for more than one Google My Business listing at the same address?

A: If your businesses aren’t registered as legally distinct entities or if you lack unique phone numbers for them, you are ineligible to list them separately. Also, if your businesses are simply representative of different product lines or services you offer under the umbrella of a single business — like a handyman who repairs both water heaters and air conditioners — they aren’t eligible for separate listings. Additionally, do not list multiple businesses at PO boxes, virtual offices, mailboxes at remote locations, or at locations you don’t have the authority to represent.

Q: Will I be penalized if I list multiple ineligible businesses at the same address?

A: Yes, you could be. Google could issue a hard suspension on one or more of your ineligible listings at any time. A hard suspension means that Google has removed your listing and its associated reviews.

Q: Will suite numbers help me convince Google I actually have two locations so that I can have more than one GMB listing?

A: No. Google doesn’t pay attention to suite numbers, whether legitimate or created fictitiously. Don’t waste time attempting to make a single location appear like multiple locations by assigning different suite numbers to the entities in hopes of qualifying for multiple listings.

Q: Can I list my business at a co-working space, even though there are multiple businesses at the same address?

A: If your business has a unique, direct phone number answered by you and you are staffing the co-working space with your own staff at your listed hours, yes, you are typically eligible for a Google My Business listing. However, if any of the other businesses at the location share your categories or are competing for the same search terms, it is likely that you or your competitors will be filtered out of Google’s mapping product due to the shared elements.

Q: How many GMB listings can I have if there are multiple seasonal businesses at my address?

A: If your property hosts an organic fruit stand in summer and a Christmas tree farm in the winter, you need to closely follow Google’s requirements for seasonal businesses. In order for each entity to qualify for a listing, it must have year-round signage and set and then remove its GMB hours at the opening and closing of its season. Each entity should have a distinct name, phone number and Google categories.

Q: How should I name my multiple businesses at the same address?

A: To decrease the risk of filtering or penalties, co-located businesses must pay meticulous attention to allowed naming conventions. Questions surrounding this typically fall into five categories:

  1. If one business is contained inside another, as in the case of a McDonald's inside a Walmart, the Google My Business names should be "McDonald's" and "Walmart," not "McDonald's in Walmart".
  2. If co-located brands like a Taco Bell and a Dunkin’ Donuts share the same location, they should not combine their brand names for the listing. They should either create a single listing with just one of the brand names, or, if the brands operate independently, a unique listing for each separate brand.
  3. If multiple listings actually reflect eligible departments within a business — like the sales and parts departments of a Chevrolet dealership — then it’s correct to name the listings Chevrolet Sales Department and Chevrolet Parts Department. No penalties should result from the shared branding elements, so long as the different departments have some distinct words in their names, distinct phone numbers and distinct GMB categories.
  4. If a brand sells another brand’s products — like Big-O selling Firestone Tires — don’t include the branding of the product being sold in the GMB business name. However, Google stipulates that if the business location is an authorized and fully dedicated seller of the branded product or service (sometimes known as a "franchisee"), you may use the underlying brand name when creating the listing, such as "TCC Verizon Wireless Premium Retailer.”
  5. If an owner is starting out with several new businesses at the same location, it would be a best practice to keep their names distinct. For example, a person operating a pottery studio and a pet grooming station out of the same building can lessen the chance of filters, penalties, and other problems by avoiding naming conventions like “Rainbow Pottery” and “Rainbow Pet Grooming” at the same location.

Q: Can I create separate listings for classes, meetings, or events that share a location?

A: Unfortunately the guidelines on this topic lack definition. Google says not to create such listings for any location you don’t own or have the authority to represent. But even if you do own the building, the guidelines can lead to confusion. For example, a college can create separate listings for different departments on campus, but should not create a listing for every class being offered, even if the owners of the college do have authority to represent it.

Another example would be a yoga instructor who teaches at three different locations. If the building owners give them permission to list themselves at the locations, along with other instructors, the guidelines appear to permit creating multiple listings of this kind. However, such activity could end up being perceived as spam, could be filtered out because of shared elements with other yoga classes at a location, and could end up competing with the building’s own listing.

Because the guidelines are not terribly clear, there is some leeway in this regard. Use your discretion in creating such listings and view them as experimental in case Google should remove them at some point.

Q: How do I set GMB hours for co-located business features that serve different functions?

A: A limited number of business models have to worry about this issue of having two sets of hours for specific features of a business that exist on the same premises but serve unique purposes. For example, a gas station can have a convenience market that is open 6 AM to 10 PM, but pumps that operate 24 hours a day. Google sums up the shortlist for such scenarios this way, which I’ll quote verbatim:

  • Banks: Use lobby hours if possible. Otherwise, use drive-through hours. An ATM attached to a bank can use its own separate listing with its own, different hours.
  • Car dealerships: Use car sales hours. If hours for new car sales and pre-owned car sales differ, use the new sales hours.
  • Gas stations: Use the hours for your gas pumps.
  • Restaurants: Use the hours when diners can sit down and dine in your restaurant. Otherwise, use takeout hours. If neither of those is possible, use drive-through hours, or, as a last resort, delivery hours.
  • Storage facilities: Use office hours. Otherwise, use front gate hours.

Q: Could the details of my Google listing get mixed up with another business at my location?

A: Not long ago, local SEO blogs frequently documented cases of listing “conflation.” Details like similar or shared names, addresses, or phone numbers could cause Google to merge two listings, resulting in strange outcomes like the reviews for one company appearing on the listing of another. This buggy mayhem has thankfully died down, to the extent that I haven’t seen a report of listing conflation in some years. However, errors like these made it clear that each business you operate should always have its own phone number, its naming should be as unique as possible, and its categories should always be carefully evaluated.

Q: Why is only one of my multiple businesses at the same location ranking in Google’s local results?

A: The most common cause of this is that Google is filtering out all but one of your businesses from ranking because of listing element similarity. If you attempt to create multiple listings for businesses that share Google categories or are competing for the same keyword phrases at the same address, Google’s filters will typically make all but one of the entities invisible at the automatic zoom level of its mapping product. For this reason, creating multiple GMB listings for businesses that share categories or industries is not a best practice and should be avoided.

Q: My GMB listing is being filtered due to co-location. What should I do?

A: This topic has come to the fore especially since Google’s rollout of the Possum filter on Sept 1, 2016. Businesses at the same address (or even in the same neighborhood) that share a category and are competing for the same search phrases often have the disappointment of discovering that their GMB listing appears to be missing from the map while a co-located or nearby competitor ranks highly. Google’s effort to deliver diversity causes them to filter out companies that they deem too similar when they’re in close proximity to one another.

If you're currently sharing a building with a competitor and you've been puzzled as to why you seem invisible on Google's maps, zoom in on the map and see if your listing suddenly appears. If it does, chances are you're experiencing filtering.

If this is your predicament, you have a few options for addressing it. As a last resort, you could relocate your company to a part of town where you don’t have to share a location and have no nearby competitors, but that’s an extreme solution. More practically, you will need to audit your competitor, comparing their metrics to yours to discover why Google sees them as the stronger search result. From the results of your audit, you can create a strategy for surpassing your opponent so that Google decides it’s your business that deserves not to be filtered out.
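If it helps to make that comparison step concrete, here is a minimal sketch in Python of lining up your metrics against a co-located competitor's. Every metric name and number in it is a hypothetical placeholder; you'd fill them in from whatever tools you already use (Link Explorer, your review platforms, a citation audit, and so on).

```python
# A minimal sketch of the competitor-comparison step of a local SEO audit.
# All metric names and values below are hypothetical placeholders; populate
# them from your own tools (e.g., Link Explorer, review platforms, citation audits).

my_business = {
    "domain_authority": 28,
    "linking_root_domains": 65,
    "google_review_count": 14,
    "average_review_rating": 4.2,
    "citation_count": 40,
}

competitor = {
    "domain_authority": 35,
    "linking_root_domains": 120,
    "google_review_count": 51,
    "average_review_rating": 4.7,
    "citation_count": 55,
}

def compare(mine: dict, theirs: dict) -> None:
    """Print each metric side by side so you can prioritize where you lag."""
    for metric, my_value in mine.items():
        their_value = theirs.get(metric)
        if their_value is None:
            continue
        status = "BEHIND" if their_value > my_value else "ahead/even"
        print(f"{metric:>24}: you {my_value} vs them {their_value} ({status})")

if __name__ == "__main__":
    compare(my_business, competitor)
```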

Summing Up

There’s nothing wrong with multiple businesses sharing an address. Google’s local index is filled with businesses in this exact situation ranking just fine without fear of penalization. But the key to success and safety in this scenario is definitely in the details.

Assessing eligibility, accurately and honestly representing your brand, adhering to guidelines and best practices, and working hard to beat the filters will stand you in good stead.



Friday, February 14, 2020

How to Get Your Web Developer on Board with SEO [Bonus PDF] - Whiteboard Friday

Posted by BritneyMuller

You've figured out what's wrong, and you've delivered a laundry list of demands to your web dev team: re-index these pages, fix this duplicate content, redirect these URLs... but how often are those fixes prioritized, and how much time do you invest in pushing to get them there?

Cultivating a positive, productive relationship with your web developers is one of the smartest (and most empathetic) things you can do as an SEO. After all, they're your other half, the key to getting your work done quickly and well. In this Whiteboard Friday, Britney Muller shares six essential ways to get your web dev on board with SEO, from working to better understand their role and offer help when you can, to sharing your wins and asking for feedback on working together.

And don't miss the end — we've released our brand-new Web Developer's SEO Cheat Sheet for 2020, the perfect pairing for today's video! 

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we're talking all about how to get your web dev on board with SEO. I'm really excited. I think you'll notice my biggest point here: I couldn't feel more strongly that we really do have so much to learn from developers. It's wild.

Hopefully, this video helps kind of open some of your minds or expand some of the ways in which you can do that, because it will make you a lot stronger. 

1. Create a genuine relationship with developers and work to better understand their role

First and foremost, create a genuine relationship with the developers you work with. Better understand their role and how they're measuring their own success. Know what languages they program in. Better understanding their perspective and their opinion on things helps you create a better working relationship. Part of that is cultivating trust. 

One of the ways in which I've found some success cultivating that trust is just admitting when I have no idea how to implement a particular SEO fix. Or even when I think that I do, I prefer to ask, "What is the best way you see this being implemented? How would you most efficiently implement this change or optimization?" More often than not, they will have a far better way to make these website changes because they have that backend knowledge of the website.

Be humble and express that you don't know everything. You're not trying to step in and tell them how to program pages or how to fix things; it should be much more of a two-way exchange of information.

2. Learning from developers helps you become a stronger SEO

I promise you. It is one of my favorite things about working here at Moz. I have learned so much from the developers here. But likewise, some of the developers have learned things from me and fellow SEOs who work here. This is a symbiotic working relationship, where developers want to program sites and pages that do well in search.

I think it's part of your job to express and communicate the potential value of a well-crafted web page. Show them how much more powerful their code and their work can be when it's set up properly, with JavaScript rendering and the way Google crawls different aspects of it in mind.

That's what makes it a really efficient working relationship. Be open to just learning new things from your developers. 

3. Be a champion of the developers you work with

Understand what it is they're trying to accomplish. If there's any way you can help support that or consider that in your work and the things you're making progress on, it's a win-win.

4. Create a workflow/process to keep an eye on dev changes and catch things early

This is a common problem. Something that a lot of people ask about is creating this workflow or process in which you can keep an eye on dev changes. For really large websites, this can get unwieldy. It can be difficult to keep an eye on changes that might affect SEO.

But have that conversation with the developer or team of developers you work with: What's the best way to manage this? Can you add me to GitHub so I can see what's getting pushed? Whatever that looks like, it helps create the space where you're doing preventative SEO. You're making sure that nothing goes terribly wrong in the future, which makes things more manageable in the long run.
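To make "preventative SEO" a bit more concrete, here's a rough sketch of the kind of check a developer could wire into a deploy or CI step. The URLs are placeholders, and it only looks for a few common regressions (status codes, an accidental noindex, a missing canonical or title); treat it as a starting point, not a full audit.

```python
# A rough sketch of a preventative SEO check that could run after each deploy.
# The URLs below are placeholders; swap in your own key pages (or a staging host).
# Requires: pip install requests beautifulsoup4

import sys
import requests
from bs4 import BeautifulSoup

KEY_PAGES = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/pricing/",
]

def check_page(url: str) -> list[str]:
    """Return a list of human-readable problems found on one URL."""
    problems = []
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        problems.append(f"{url}: returned HTTP {response.status_code}")
        return problems

    soup = BeautifulSoup(response.text, "html.parser")

    # Catch an accidental noindex shipped to production.
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    if robots_meta and "noindex" in robots_meta.get("content", "").lower():
        problems.append(f"{url}: robots meta contains noindex")

    # Make sure the canonical tag didn't quietly disappear.
    if soup.find("link", rel="canonical") is None:
        problems.append(f"{url}: missing canonical tag")

    # A missing or empty title usually means a template change went wrong.
    if soup.title is None or not soup.title.get_text(strip=True):
        problems.append(f"{url}: missing or empty <title>")

    return problems

if __name__ == "__main__":
    all_problems = [p for url in KEY_PAGES for p in check_page(url)]
    for problem in all_problems:
        print(problem)
    # Exit non-zero so a CI job fails loudly when something regresses.
    sys.exit(1 if all_problems else 0)
```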

5. Share your SEO wins with your developers — and thank them!

Share your SEO wins with the developers, especially when they've helped you and maybe have provided better solutions. You should absolutely thank them, and what a great opportunity to sort of share in those wins and continue to grow that working relationship.

6. Ask for feedback

Lastly, ask for feedback. If you feel like you're struggling to communicate with a group of developers or a single developer, just be honest and use some radical candor and ask, "How can I better work with you? How can I better support you?" Opening that up for some feedback can also help to strengthen the relationship. 

Bonus: The brand-new Web Developer's SEO Cheat Sheet

Then the last thing, which I hope you can really leverage as a tool to grow your SEO efforts and to help you get more done with the development team, is The Web Developer's SEO Cheat Sheet.

This is a great resource to open up this conversation with developers, to sit down and have a conversation about why some of these things are important to you as an SEO and what comes to mind when they look at it. They have a totally different perspective on a lot of the things within this sheet.

Download the free Cheat Sheet

It's a great opportunity for you to sit down and have those conversations and be able to excel in your SEO efforts. I hope some of this helps. I think it's one of the most important things in getting SEO work done and seeing that success.

Please let me know what you think down below in the comments. I look forward to hearing your thoughts on this, what's worked for you, what hasn't worked for you, and what other questions you have. I will see you all again soon. Thanks.

Video transcription by Speechpad.com



Monday, February 10, 2020

The Power of "Is": A Featured Snippet Case Study

Posted by EricSerdar

I’m not a literary scholar, but I believe it was Hamlet that said “to have a featured snippet or not to have a featured snippet?” Ever since featured snippets came onto the scene, sites have been trying to secure them.

My team and I wanted in on this craze. Throughout our journey of research, testing, failure, and success, we found some interesting pieces of information that we wanted to share with the community. I’ll walk you through what we did and show you some of our results (though I can’t share traffic numbers).

It was Britney Muller’s webinar on Featured Snippet Essentials and the release of the featured snippets cheat sheet that inspired me to capture what we've learned.

What are featured snippets?

A featured snippet is the box that appears at the top of the search results page, providing information that succinctly and accurately answers your query and citing the source website.

Why are featured snippets important?

A featured snippet is important because it represents an additional SERP feature that you can secure. Usually located at the very top of the results page, featured snippets offer you greater visibility to searchers and can boost brand recognition.

Our featured snippet plan of attack

  1. Research, research, and more research on how to pull this off
  2. Identify keywords we wanted to target
  3. Change how we structured our on-page content
  4. Measure, test, and repeat the process

1. Research, research, and more research

We spent a great deal of time researching featured snippets. We looked at different ways to find featured snippet opportunities and researched how to optimize our content for them. We also went and saw Kellie Gibson speak on featured snippet volatility.

Did we implement everything from what we learned during this discovery phase into our featured snippet strategy? No. Are we perfect at it now after a year and a half of practicing this? No, no, no. We are getting better at it, though.

2. Identify keywords we wanted to target

We originally started out focusing on big “head” keywords. These represented terms that had indeterminate searcher intent. The first head term that we focused on was HRIS. It stands for Human Resources Information System — sexy, right?

Note: Looking back on this, I wish we had focused on longer tail keywords when testing out this strategy. It's possible we could have refined our process faster focusing on long tail keywords instead of the large head terms.

3. Change how we structure our on-page content

We worked closely with our writing team to update how we lay out content on our blog. We changed how we use H2s, H3s (we actually use them now!), lists, and so on to help make our content easier to read for both users and robots.

In most of the content where we’re trying to rank for a featured snippet, we have an H2 in the form of a question. Immediately after the H2, we try to answer that question. We’ve found this to be highly successful (see pictures later on in the post). I wish I could say that we learned this tactic on our first try, but it took several months before it dawned on us.
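As a rough illustration of that pattern, here's a small sketch that checks a drafted HTML fragment for a question-style H2 followed immediately by a short paragraph answer. The sample markup and the word-count threshold are placeholders I made up for the example, not values from our testing.

```python
# A small sketch that checks drafted content for the pattern described above:
# an H2 phrased as a question, followed immediately by a concise paragraph answer.
# The sample HTML and the word-count threshold are placeholders, not tested values.
# Requires: pip install beautifulsoup4

from bs4 import BeautifulSoup

DRAFT_HTML = """
<h2>What is Employee Orientation?</h2>
<p>Employee orientation is the process of introducing new hires to their
role, their coworkers, and the company's policies and culture.</p>
"""

MAX_ANSWER_WORDS = 60  # arbitrary placeholder; tune for your own content

def audit_snippet_structure(html: str) -> list[str]:
    """Flag question-style H2s that aren't followed by a short paragraph answer."""
    soup = BeautifulSoup(html, "html.parser")
    notes = []
    for h2 in soup.find_all("h2"):
        heading = h2.get_text(strip=True)
        if not heading.endswith("?"):
            continue  # only auditing question-style headings here
        answer = h2.find_next_sibling("p")
        if answer is None:
            notes.append(f"No paragraph immediately after: {heading!r}")
            continue
        word_count = len(answer.get_text(strip=True).split())
        if word_count > MAX_ANSWER_WORDS:
            notes.append(f"Answer under {heading!r} is {word_count} words; consider tightening it")
    return notes

if __name__ == "__main__":
    for note in audit_snippet_structure(DRAFT_HTML):
        print(note)
    # No output means every question heading has a short answer right below it.
```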

4. Measure, test, and repeat

The first blog post that we tried this out on was our “What is an HRIS” article. Overall, this post was a partial success: it ranked for the head term that we were going for (HRIS), but we didn’t win a featured snippet. We deemed it a slight failure and went back to work.

This is where the fun started.

Featured snippet successes

We discovered a featured snippet trigger that we could capitalize on — mainly by accident. What was it?

Is.

Really. That was it. Just by adding that to some of our content, we started to pick up featured snippets. We started to do it more and more, and we were winning more and more featured snippets! I believe it was this strategic HR example that clued us in to the “is” trigger.

So we kept it up.

Featured snippet won for "employee orientation"
Featured snippet won for "hr business partner"
Featured snippet won for "employee development plan"

What did we learn?

I want to preface this by saying that all of this is anecdotal evidence. We haven’t looked at several million URLs, run it through any fancy number-crunching, or had a statistician look at the data. These are just a few examples that we’ve noticed that, when repeated, have worked for us.

  1. Blog/HR glossary - We found that it was easier for us to gain featured snippets from our blog or our glossary pages. It seemed like no matter what optimizations we made on the product page, we weren’t able to make it happen.
  2. Is - No, not the clown from the Stephen King novel. “Is” seemed to be the big trigger word for winning featured snippets. During our audit, we did find some examples of list featured snippets, but the majority were paragraphs and the trigger word was "is."
  3. Definitions - We saw that a definition of the head term we were targeting was usually what won the snippet. Our on-page copy would have the H2 with the keyword (e.g., What is Employee Orientation?) and then the paragraph copy would answer that question.
  4. Updating old posts - One surprising thing we learned is that when we went back to old posts and tried adding the “is” trigger word, we didn’t see a change — even if we added a good amount of new content to the page. We were only able to grab featured snippets with new content that we created. Also, when we updated large amounts of content on a few pages that had featured snippets, we lost them. We made sure to not touch the sections of the page that the snippet was pulling from, but we still lost the snippet (some have come back, but some are still gone).

Conclusion

A few final things to note:

  1. First, while these examples are anecdotal, I think they show some practices that anyone wanting to capture featured snippets can apply.
  2. Second, this process played out over a 12–18 month period, and we’re still evolving what we think is the best approach for us and our content team.
  3. Third, we had a lot of failures with this. I showed you one example, but we’ve had many (short-form content, long-form content, glossary terms, blog posts, etc.) that didn’t work. We just kept measuring, testing, and optimizing. 
  4. Lastly, I need to give a shout out to our writing team. We massively disrupted their process with this and they have been phenomenal to work with (effective interdepartmental relationships are crucial for any SEO project).

Let me know what's worked for you or if you have any questions by leaving a comment down below.

Note: On January 23, 2020 Google announced that featured snippets would no longer be listed twice on the first page. For more information, you can check out this thread from Google Search Liaison. This may change how valuable featured snippets are to companies and the number of clicks a listing gets. Before you start to panic, remember it will be important to watch and measure how this affects your site before doing anything drastic. If you do decide to go nuclear and remove your featured snippets from the results, check out this documentation.
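For anyone who does go that route, Google's documented snippet controls include the nosnippet and max-snippet robots directives and the data-nosnippet attribute. As a small, hedged illustration, the sketch below checks whether a page already carries one of those opt-out signals; the URL is a placeholder.

```python
# A small sketch that checks whether a page is already opting out of snippets via
# the controls Google documents (nosnippet, max-snippet, data-nosnippet).
# The URL is a placeholder. Requires: pip install requests beautifulsoup4

import requests
from bs4 import BeautifulSoup

def snippet_optout_signals(url: str) -> list[str]:
    """Return the snippet opt-out signals found on the page, if any."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    signals = []

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    content = (robots_meta.get("content", "") if robots_meta else "").lower()
    if "nosnippet" in content:
        signals.append("robots meta: nosnippet")
    if "max-snippet:0" in content.replace(" ", ""):
        signals.append("robots meta: max-snippet:0")

    # data-nosnippet marks individual sections as off-limits for snippets.
    if soup.find(attrs={"data-nosnippet": True}):
        signals.append("data-nosnippet attribute present")

    return signals

if __name__ == "__main__":
    for signal in snippet_optout_signals("https://www.example.com/blog/post/"):
        print(signal)
    # No output means no snippet opt-out signals were detected on the page.
```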



Friday, February 7, 2020

7 SEO Processes That Get Easier with Increased PageRank/Domain Authority - Whiteboard Friday

Posted by Cyrus-Shepard

A rising tide lifts all ships — and it's a similar story with increased site authority. What factors are affected as you improve PageRank or Domain Authority, and how? In today's Whiteboard Friday, Cyrus details seven SEO processes that are made easier by a strong investment in link building and growing your authority.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I'm Cyrus Shepard. Quick Whiteboard this week. I want to talk about links.

We know in SEO we love links. Everybody wants links. But why? What do links do for you? They do a surprising amount for you that we sometimes don't realize. So the title of today's Whiteboard, "7 SEO Processes That Get Easier with Increased PageRank and Domain Authority." So why did we choose PageRank and Domain Authority?

Well, these are both algorithms that measure link power, both the quantity of links and the quality of those links. PageRank is Google's algorithm to rank web pages based on popularity and importance. Domain Authority, which Google doesn't use, just to be clear, is a Moz algorithm that measures both link quantity and quality.

For our purposes, we can basically use them in the same conversation. We're talking about the power of your links. 

1. Ranking ability

The first thing that everybody knows about is links help you rank. They help you rank in many, many ways. You can get higher rankings. You can attack more keywords, but most importantly, you can attack more competitive keywords.

When I'm trying to see if I can rank for a keyword, a good thing I like to do is simply Google it, check the Page Authority (a very similar metric) of all the top-ranking pages, and compare it to the Page Authority of my own page for that keyword. That gives you a pretty good idea of whether you have the ability to rank for it.
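If you want to do that comparison in bulk rather than eyeballing one SERP at a time, something like the sketch below could pull Page Authority programmatically. It assumes the Moz Links API v2 url_metrics endpoint and its field names as I understand them, so double-check the current API docs; the credentials and URLs are placeholders.

```python
# A hedged sketch of pulling Page Authority for a set of ranking URLs in bulk.
# Assumes Moz Links API v2's url_metrics endpoint and its field names; verify
# against the current API docs before relying on this. Credentials and target
# URLs below are placeholders. Requires: pip install requests

import requests

MOZ_ACCESS_ID = "your-access-id"      # placeholder
MOZ_SECRET_KEY = "your-secret-key"    # placeholder

TOP_RANKING_URLS = [
    "https://competitor-one.example/guide",
    "https://competitor-two.example/blog/post",
    "https://www.your-site.example/your-page",
]

def fetch_page_authority(urls: list[str]) -> dict[str, float]:
    """Return {url: page_authority} using the (assumed) v2 url_metrics endpoint."""
    response = requests.post(
        "https://lsapi.seomoz.com/v2/url_metrics",
        json={"targets": urls},
        auth=(MOZ_ACCESS_ID, MOZ_SECRET_KEY),
        timeout=30,
    )
    response.raise_for_status()
    results = response.json().get("results", [])
    # Assumes results come back in the same order as the submitted targets.
    return {url: row.get("page_authority") for url, row in zip(urls, results)}

if __name__ == "__main__":
    for url, pa in fetch_page_authority(TOP_RANKING_URLS).items():
        print(f"PA {pa}: {url}")
```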

2. Crawl budget

But then we get into the nitty-gritty, the other benefits of having that link equity, one of the most important being crawl budget.

When you have more link equity, Google will crawl more of your pages. If you only have a handful of links and a million pages on your website, it's going to be very difficult to get Google to crawl and index all those million pages. If you're eBay or Amazon or Google or a site with like a 100 Domain Authority, yes, you might be able to attract Google to crawl those million pages.

3. Indexation speed

Google will also crawl them faster. You may get Google to crawl your pages with low Domain Authority, but it's going to take a while for Google to visit those again. So then we get into the idea of indexation speed. With a higher Domain Authority, Google is going to crawl and index your content typically much faster than they would without.

So if you have a page that you've updated recently, you're going to see Google update it quicker the more authority that page has. Also you're going to see this in the SERPs. If you have outdated title tags or meta descriptions, you can ask Google to recrawl the page via the URL Inspection tool in Search Console (the old Submit URL tool has been retired). But generally, the more authority and incoming link power a page has, the quicker you're going to see those things updated compared to a page with low link equity.

4. More powerful links

This is my favorite one. With increased link equity, your own links become more powerful, and this gives you incredible ranking power: the internal links you point at your own pages carry more of that link equity, so it makes everything easier to rank. The best link building you can do when you have high authority is linking to yourself, and it's so easy.

But also the links that you link out to other people also become more valuable, which makes you a more attractive target. 
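One practical way to put those internal links to work is to find pages on your own site that mention a target phrase but don't yet link to the page you want to boost. Here's a minimal sketch of that idea; the page list, target URL, and phrase are placeholders, and a fuller version would pull URLs from your XML sitemap.

```python
# A minimal sketch for finding internal linking opportunities: pages that mention
# a target phrase but don't yet link to the page you're trying to boost.
# The page list, target URL, and phrase are placeholders; a fuller version would
# read URLs from your XML sitemap. Requires: pip install requests beautifulsoup4

import requests
from bs4 import BeautifulSoup

YOUR_PAGES = [
    "https://www.example.com/blog/post-a/",
    "https://www.example.com/blog/post-b/",
]
TARGET_URL = "https://www.example.com/guides/widgets/"
TARGET_PHRASE = "widget maintenance"

def find_internal_link_opportunities() -> list[str]:
    """Return pages that mention the phrase but don't link to the target page."""
    opportunities = []
    for page in YOUR_PAGES:
        html = requests.get(page, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        text = soup.get_text(" ", strip=True).lower()
        links = {a.get("href", "") for a in soup.find_all("a")}
        mentions_phrase = TARGET_PHRASE.lower() in text
        already_links = any(TARGET_URL in href for href in links)
        if mentions_phrase and not already_links:
            opportunities.append(page)
    return opportunities

if __name__ == "__main__":
    for page in find_internal_link_opportunities():
        print(f"Consider linking to {TARGET_URL} from {page}")
```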

5. Insulation from bad links

My friend Everett Sizemore came up with that word "insulation." With better link equity, you're somewhat protected from a handful of bad links. If you have low link equity and you get a bunch of spam links to your site, your risk of penalization or of being impacted by negative SEO is pretty high.

But if you have a million links, a handful of bad links just aren't going to hurt you. A good way to think about this is ratios, because, of course, anybody can get penalized. Anybody can suffer the consequences of bad links. But if those bad links only make up a tiny portion, meaning a small ratio, then you are somewhat insulated from the impact of those bad links.

6. Less over-optimization

Now Google says they don't have an over-optimization penalty. But anecdotally, many SEOs understand that if you're a small site that's just starting out, it's very easy to over-optimize for keywords with exact match anchor text and not rank. The key, usually, is that in SEO you want a lot of variety.

With a lot of links, that variety is much easier to get, and you have much less risk of over-optimization in linking internally with exact match anchor text. You can get away with a lot more with higher Domain Authority than you can with less Domain Authority. That's kind of the key to this whole thing. With higher Domain Authority, you just get away with a lot more. It's the idea of the rich getting richer. 

7. The flywheel effect

Rand Fishkin, our friend, likes to talk about the flywheel effect. When you have more links, everything gets easier. When you start ranking and people start seeing you in the SERPs, you're going to get more links from that content, and more links are going to equal more ranking and the wheel is just going to keep turning and turning.

More people want to link to you and amplify you and work with you. You're also going to get a lot more spam requests and link requests and things like that, which isn't fun. But generally, the more Domain Authority you have, the more PageRank you have, the easier life is going to get, and you just want to keep building it up day after day after day. So, like I said, a quick and easy Whiteboard Friday this week.

Hope you enjoyed it. We'll talk to you next time. Thanks, everybody.

Video transcription by Speechpad.com

