Friday, 31 January 2020

SEO for 2020 - Whiteboard Friday

Posted by BritneyMuller

It's a brand-new decade, rich with all the promise of a fresh start and new beginnings. But does that mean you should be doing anything different with regard to your SEO?

In this Whiteboard Friday, our Senior SEO Scientist Britney Muller offers a seventeen-point checklist of things you ought to keep in mind for executing on modern, effective SEO. You'll encounter both old favorites (optimizing title tags, anyone?) and cutting-edge ideas to power your search strategy from this year on into the future.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we are talking about SEO in 2020. What does that look like? How have things changed?

Do we need to be optimizing for favicons and BERT? We definitely don't. But here are some of the things that I feel we should be keeping an eye on. 

☑ Cover your bases with foundational SEO

Titles, metas, headers, alt text, site speed, robots.txt, site maps, UX, CRO, Analytics, etc.

Covering your bases with foundational SEO will continue to be incredibly important in 2020: basic things like title tags, meta descriptions, alt text, all of the SEO 101 things.

There have been some conversations in the industry lately about alt text and things of that nature. When Google is getting so good at figuring out and knowing what's in an image, why would we necessarily need to continue providing alt text?

But you have to remember we need to continue to make the web an accessible place, and so for accessibility purposes we should absolutely continue to do those things. But I do highly suggest you check out Google's Cloud Vision API and play around with that to see how good they've actually gotten. It's pretty cool.

☑ Schema markup

FAQ, Breadcrumbs, News, Business Info, etc.

Schema markup will continue to be really important, FAQ schema, breadcrumbs, business info. The News schema that now is occurring in voice results is really interesting. I think we will see this space continue to grow, and you can definitely leverage those different markup types for your website. 
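As a concrete illustration, FAQ markup is just JSON-LD embedded in the page. Here's a minimal sketch that builds a schema.org FAQPage object and wraps it in a script tag; the question and answer text are made up for the example:

```python
import json

# Build a minimal schema.org FAQPage object (illustrative Q&A content).
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is schema markup?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Structured data that helps search engines understand page content.",
            },
        }
    ],
}

# Embed it in the page as a JSON-LD script tag.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(faq, indent=2)
print(snippet)
```

Paste the resulting tag into the page's HTML and validate it with Google's Rich Results Test before shipping.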

☑ Research what matters for your industry!

Keep in mind: there are going to be a lot of articles, research, and information coming at you about where things are going and what you should do to prepare. I want you to take a strategic stance on your industry and what's important in your space.

While I might suggest page speed is going to be really important in 2020, is it for your industry? We should still worry about these things and still continue to improve them. But if you're able to take a clearer look at ranking factors and what appears to be a factor for your specific space, you can better prioritize your fixes and leverage industry information to help you focus.

☑ National SERPs will no longer be reliable

You need to be acquiring localized SERPs and rankings.

This has been the case for a while. We need to localize search results and rankings to get an accurate, clear picture of what's going on in search results. I was going to put E-A-T here and then kind of cross it off.

A lot of people feel E-A-T is a huge factor moving forward. For the purposes of this post, though: it's always been a factor. It's been that way for the last ten-plus years, and we need to continue doing that work despite these various updates. I think it's always been important, and it will continue to be so.

☑ Write good and useful content for people

While you can't optimize for BERT, you can write better for NLP.

While you can't necessarily optimize for something like BERT, you can write really great content that people are looking for. That helps optimize your text for natural language processing and makes it more accessible and friendly to models like BERT.

☑ Understand and fulfill searcher intent, and keep in mind that there's oftentimes multi-intent

One thing to think about in this space is that we've kind of gone from very, very specific keywords to a richer understanding of, okay, what is the intent behind these keywords? How can we organize that and provide even better value and content to our visitors?

One way to go about that is to consider that Google houses the world's data. They know what people are searching for when they look for a particular thing in search. So put your detective glasses on and examine what they are showing for a particular keyword.

Is there a common theme throughout the pages? Tailor your content and your intent to solve for that. You could write the best article in the world on DIY Halloween costumes, but if you're not providing those visual elements that you so clearly see in a Google search result page, you're never going to rank on page 1.

☑ Entity and topical integration baked into your IA

Have a rich understanding of your audience and what they're seeking.

This plays well into entities and topical understanding. Again, we've gone from keywords and now we want to have this richer, better awareness of keyword buckets. 

What are those topical things that people are looking for in your particular space? What are the entities, the people, places, or things that people are investigating in your space, and how can you better organize your website to provide some of those answers and those structures around those different pieces? That's incredibly important, and I look forward to seeing where this goes in 2020. 

☑ Optimize for featured snippets

Featured snippets are not going anywhere. They are here to stay. The best way to do this is to find the keywords that you currently rank on page 1 for that also have a featured snippet box. These are your opportunities. If you're on page 1, you're way more apt to potentially steal or rank for a featured snippet.

One of the best ways to do that is to provide really succinct, beautiful, easy-to-understand summaries, takeaways, etc., kind of mimic what other people are doing, but obviously don't copy or steal any of that. Really fun space to explore and get better at in 2020. 
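The filtering step described above is easy to script against a keyword-tool export. Here's a sketch assuming a CSV with hypothetical `keyword`, `position`, and `has_snippet` columns; real exports from Moz Pro, Ahrefs, etc. will name these fields differently:

```python
import csv
import io

# Hypothetical keyword export: your rank and whether the SERP shows a featured snippet.
data = """keyword,position,has_snippet
diy halloween costumes,3,yes
easy costume ideas,14,yes
pumpkin carving stencils,7,no
last minute costumes,5,yes
"""

# Opportunities: page-1 rankings (positions 1-10) on SERPs that have a snippet box.
opportunities = [
    row["keyword"]
    for row in csv.DictReader(io.StringIO(data))
    if int(row["position"]) <= 10 and row["has_snippet"] == "yes"
]
print(opportunities)  # ['diy halloween costumes', 'last minute costumes']
```

Swap the inline string for your actual export file and adjust the column names to match it.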

☑ Invest in visuals

We see Google putting more authority behind visuals, whether in search or anywhere else. It is incredibly valuable for your SEO, whether it be unique images or video content that is organized in a structured way, where Google can provide those sections in that video search result. You can do all sorts of really neat things with visuals.

☑ Cultivate engagement

This is good anyway, and we should have been doing this before. Gary Illyes was quoted as saying, "Comments are better for on-site engagement than social signals." I will let you interpret that how you will.

But I think it goes to show that engagement and creating this community is still going to be incredibly important moving forward into the future.

☑ Repurpose your content

Blog post → slides → audio → video

This is so important, and it will help you excel even more in 2020: find your top-performing web pages and repurpose them into maybe a SlideShare, maybe a YouTube video, maybe various pins on Pinterest, or answers on Quora.

You can start to refurbish your content and expand your reach online, which is really exciting. In addition to that, it's also interesting to play around with the idea of providing people options to consume your content. Even with this Whiteboard Friday, we could have an audio version that people could just listen to if they were on their commute. We have the transcription. Provide options for people to consume your content. 

☑ Prune or improve thin or low-quality pages

This has been incredibly powerful for me and for many other SEOs I know in improving the perceived quality of a site. So consider testing meta no-indexing of low-quality, thin pages on a website. Especially on larger websites, we see a pretty big impact there.

☑ Get customer insights!

This will continue to be valuable in understanding your target market, and it will be valuable for influencer marketing, for all sorts of reasons. One incredible tool currently being built by our Whiteboard Friday extraordinaire, Rand Fishkin, is SparkToro. So you have to check that out when it gets released soon. Super exciting.

☑ Find keyword opportunities in Google Search Console

It's shocking how few people do this and how accessible it is. If you go into your Google Search Console and you export as much data as you can around your queries, your click-through rate, your position, and impressions, you can do some incredible, simple visualizations to find opportunities.

For example, if this is the rank of your keywords and this is the click-through rate, where do you have high click-through rate but low ranking position? What are those opportunity keywords? Incredibly valuable. You can do this in all sorts of tools. One I recommend, and I will create a little tutorial for, is a free tool called Facets, made by Google for machine learning. It makes it really easy to just pick those apart. 
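That slicing is simple to reproduce in a script once you've exported the query report. A sketch with made-up numbers, assuming fields named `query`, `ctr`, and `position` (the actual export headers vary depending on how you export):

```python
# Hypothetical Search Console query export: (query, CTR, average position).
rows = [
    {"query": "seo checklist", "ctr": 0.12, "position": 9.4},
    {"query": "title tag length", "ctr": 0.02, "position": 2.1},
    {"query": "what is bert", "ctr": 0.18, "position": 12.8},
    {"query": "meta description tips", "ctr": 0.09, "position": 15.0},
]

# Opportunity keywords: CTR already high, but average position still low.
# The thresholds here are arbitrary; tune them against your own data.
opportunities = sorted(
    (r for r in rows if r["ctr"] >= 0.10 and r["position"] > 5),
    key=lambda r: r["ctr"],
    reverse=True,
)
print([r["query"] for r in opportunities])  # ['what is bert', 'seo checklist']
```

Queries that surface here already earn clicks despite ranking poorly, so nudging their position up tends to pay off quickly.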

☑ Target link-intent keywords

Here are a couple of quick link building tactics for 2020 that will hopefully continue to work very, very well. What I mean by link-intent keywords: your keyword statistics, your keyword facts.

These are searches people naturally want to reference. They want to link to it. They want to cite it in a presentation. If you can build really great content around those link-intent keywords, you can do incredibly well and naturally build links to a website. 

☑ Podcasts

Whether you're a guest or a host on a podcast, it's incredibly easy to get links. It's kind of a fun link building hack. 

☑ Provide unique research with visuals

Andy Crestodina does this incredibly well. So explore creating your own unique research that isn't too commercial but is genuinely valuable for users. I know this was a lot.

There's a lot going on in 2020, but I hope some of this is valuable to you. I truly can't wait to hear your thoughts on these recommendations, things you think I missed, things that you would remove or change. Please let us know down below in the comments, and I will see you all soon. Thanks.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



* This article was originally published here

Thursday, 30 January 2020

How to Check if Google Manually Reviewed Your Site

Do you know how Google decides which websites should rank number 1, 2, 3, and so on for any given keyword?

Well, they have an algorithm for that.

But as you know, algorithms aren’t perfect. That’s why Google continually tries to improve it.

One way that they try to improve their algorithm is through Search Quality Raters.

What’s a Search Quality Rater?

Google knows that they can always make their search results better. And one way is to have humans review their listings for any given keyword.

So, all around the world, Google has people who manually review websites. And they review each website based on these guidelines.

It’s kind of long and extensive, but it’s important to note that Quality Raters don’t directly impact rankings.

Instead, they give feedback to the engineers who code up the algorithm so they can make it more relevant to searchers.

Now, the real question is, how do you know your site is being reviewed?

First, I want you to log into your Google Analytics account and go to the audience overview report.

Then click on “Add Segment.”

Your screen should look something like this:

Then click on “+ New Segment.”

Your screen should look like the image above.

I want you to click “Conditions,” which is under the “Advanced” navigation label. Once you do that, fill out everything to match the screenshot below and click “save”.

Just make sure that when you are filling out the table you are clicking the “or” button and not the “and” button. And make sure you select “Source” for the first column.

Now that you’ve created the new segment, it’s time to see if any Quality Raters have viewed your site.

How to spot Quality Raters

When you are in Google Analytics, you’ll want to make sure you select the segment you just created.

If you copied my screenshot, you would have labeled it “Search Engine Evaluators.” And when you select it, you’ll probably see a graph that looks something like the image below.

You’ll notice that no Quality Raters have been to my site during the selected date period, which is common as they don’t visit your site daily and, in many cases, they don’t come often at all.

The other thing you’ll notice is that next to the “Audience Overview” heading, there is a yellow shield symbol. If your symbol is green, then that’s good.

Yellow means your data is being sampled.

If you see the yellow symbol, reduce your date range and you’ll eventually see a green shield next to “Audience Overview” like the image below.

In general, it is rare that Quality Raters view your site each month. But as you expand your time window, you’ll be able to spot them.

And once you spot them, you can shorten the date range so the data isn’t sampled and then drill down to what they were looking at on your website.

The key to analyzing what Quality Raters are doing on your site is to look at the “Site Content” report in Google Analytics, which will help you produce results that look like the screenshot above.

To get to that report, click on “Behavior,” then “Site Content,” and then “All Pages.”

What do I do with this information?

The goal of a Quality Rater is to help improve Google’s algorithm. And whether they have visited your site or not, your goal should be to make your site the best site in the industry.

You can do so by doing the following 3 things:

  1. Follow the quality guidelines that Google has released. It’s 168 pages long but, by skimming it, you can get a good understanding of what they are looking for.
  2. Always put the user first. Yes, you want higher rankings, but don’t focus on Google, focus on the user. In the long run, this should help you rank higher as Google’s goal is to make their algorithm optimized for user preferences over things like on-page SEO or link building.
  3. Check out Google’s advice for beating algorithm changes. In that article, you’ll find a breakdown of what Google is really looking for.

Conclusion

If you have Quality Raters browsing your site from time to time, don’t freak out. It doesn’t mean your rankings are going to go down or up.

And if you can’t find any Quality Raters visiting your site, don’t freak out either. Because that doesn’t mean that you won’t ever rank well in Google.

As your site gets more popular, you’ll notice a higher chance of Quality Raters visiting your site over time. This just means that you need to focus more on delighting your website visitors. Create the best experience for them and you’ll win in the long run.

So, have you spotted any Quality Raters in your Google Analytics?

PS: Special shoutout to Matthew Woodward who originally brought the Google Quality Raters segmentation to light.

The post How to Check if Google Manually Reviewed Your Site appeared first on Neil Patel.




Wednesday, 29 January 2020

The Dirty Little Featured Snippet Secret: Where Humans Rely on Algorithmic Intervention [Case Study]

Posted by brodieclarkconsulting

I recently finished a project where I was tasked to investigate why a site (that receives over one million organic visits per month) does not rank for any featured snippets.

This is obviously an alarming situation, since ~15% of all result pages, according to the MozCast, have a featured snippet as a SERP feature. The project was passed on to me by an industry friend. I’ve done a lot of research on featured snippets in the past. I rarely do once-off projects, but this one really caught my attention. I was determined to figure out what issue was impacting the site.

In this post, I detail my methodology for the project that I delivered, along with key takeaways for my client and others who might be faced with a similar situation. But before I dive deep into my analysis: this post does NOT have a fairy-tale ending. I wasn’t able to unclog a drain that resulted in thousands of new visitors.

I did, however, deliver massive amounts of closure for my client, allowing them to move on and invest resources into areas which will have a long-lasting impact.

Confirming suspicions with Big Data

Now, when my client first came to me, they had their own suspicions about what was happening. They had been advised by other consultants on what to do.

They had been told that the featured snippet issue was stemming from either:

1. An issue relating to conflicting structured data on the site

OR

2. An issue relating to messy HTML which was preventing the site from appearing within featured snippet results

I immediately shut down the first issue as a cause for featured snippets not appearing. I’ve written about this topic extensively in the past. Structured data (in the context of schema.org) does NOT influence featured snippets. You can read more about this in my post on Search Engine Land.

As for the second point, this is closer to reality, yet still far from it. Yes, HTML structure does help considerably when trying to rank for featured snippets. But to the point where a site that ranks for almost a million keywords doesn’t rank for any featured snippets at all? Very unlikely. There’s more to this story, but let’s confirm our suspicions first.


Let’s start from the top. Here’s what the estimated organic traffic looks like:

Note: I’m unable to show the actual traffic for this site due to confidentiality. But the monthly estimation that Ahrefs gives of 1.6M isn’t far off.

Out of the 1.6M monthly organic visits, Ahrefs picks up on 873K organic keywords. When filtering these keywords by SERP features with a featured snippet and ordering by position, you get the following:

I then did similar research with both Moz Pro using their featured snippet filtering capabilities as well as SEMrush, allowing me to see historical ranking.

All three tools displayed the same result: the site did not rank for any featured snippets at all, despite ~20% of my client's organic keywords including a featured snippet as a SERP feature (higher than the average from MozCast).

It was clear that the site did not rank for any featured snippets on Google. But who was taking this position away from my client?

The next step was to investigate whether other sites are ranking within the same niche. If they were, then this would be a clear sign of a problem.

An “us” vs “them” comparison

Again, we need to reflect back to our tools. We need our tools to figure out the top sites based on similarity of keywords. Here’s an example of this in action within Moz Pro:

Once we have our final list of similar sites, we need to complete the same analysis that was completed in the previous section of this post to see if they rank for any featured snippets.

With this analysis, we can figure out whether they have featured snippets displaying or not, along with the % of their organic keywords with a featured snippet as a SERP feature.

The next step is to add all of this data to a Google Sheet and see how everything matches up to my client's site. Here’s what this data looks like for my client:

I now need to dig deeper into the sites in my table. Are they really all that relevant, or are my tools just picking up on a subset of queries that are similar?

I found that from row 8 downwards in my table, those sites weren’t all that similar. I excluded them from my final dataset to keep things as relevant as possible.

Based on this data, I could see five other sites that were similar to my client's. Out of those five sites, only one had results where they were ranking within a featured snippet.

80% of similar sites to my client's site had the exact same issue. This is extremely important information to keep in mind going forward.

Although the sample size is considerably lower, one of those sites has ~34% of the search results it ranks for showing a featured snippet it is unable to appear in. Comparatively, this is quite problematic for that site (considering the 20% figure from my client's situation).

This analysis has been useful in figuring out whether the issue was specific to my client or the entire niche. But do we have guidelines from Google to back this up?

Google featured snippet support documentation

Within Google’s Featured Snippet Documentation, they provide details on policies surrounding the SERP feature. This is public information. But I think a very high percentage of SEOs aren’t aware (based on multiple discussions I’ve had) of how impactful some of these details can be.

For instance, the guidelines state that: 

"Because of this prominent treatment, featured snippet text, images, and the pages they come from should not violate these policies." 

They then mention 5 categories:

  1. Sexually explicit
  2. Hateful
  3. Violent
  4. Dangerous and harmful
  5. Lack consensus on public interest topics

Number five in particular is an interesting one. This section is not as clear as the other four and requires some interpretation. Google explains this category in the following way:

"Featured snippets about public interest content — including civic, medical, scientific, and historical issues — should not lack well-established or expert consensus support."

And the even more interesting part in all of this: these policies do not apply to web search listings nor cause those to be removed.

It can be lights out for featured snippets if you fall into one of these categories, yet you can still be able to rank highly within the 10-blue-link results. A bit of an odd situation.

Based on my knowledge of the client, I couldn’t say for sure whether any of the five categories were to blame for their problem. It was sure looking like it was algorithmic intervention (and I had my suspicions about which category was the potential cause).

But there was no way of confirming this. The site didn’t have a manual action within Google Search Console. That is literally the only way Google could communicate something like this to site owners.

I needed someone on the inside at Google to help.

The missing piece: Official site-specific feedback from Google

One of the most underused resources in an SEO's toolkit (in my opinion) is the Google Webmaster Hangouts held by John Mueller.

You can see the schedule for these Hangouts on YouTube here and join live, asking John a question in person if you want. You could always try John on Twitter too, but there’s nothing like video.

You’re given the opportunity to explain your question in detail. John can easily ask for clarification, and you can have a quick back-and-forth that gets to the bottom of your problem.

This is what I did in order to figure out this situation. I spoke with John live on the Hangout for ~5 minutes; you can watch my segment here if you’re interested. The result was that John gave me his email address and I was able to send through the site for him to check with the ranking team at Google.

I followed up with John on Twitter to see if he was able to get any information from the team on my client's situation. You can follow the link above to see the full piece of communication, but John’s feedback was that there wasn't a manual penalty in place for my client's site. He said that it was purely algorithmic, meaning the algorithm was deciding that the site was not allowed to rank within featured snippets.

And an important component of John’s response:


If a site doesn’t rank for any featured snippets when they're already ranking highly within organic results on Google (say, within positions 1–5), there is no way to force it to rank.

For me, this is a dirty little secret in a way (hence the title of this article). Google’s algorithms may decide that a site can’t show in a featured snippet (but could rank #2 consistently), and there's nothing a site owner can do.

...and the end result?

The result of this, in the specific niche that my client is in, is that lots of smaller, seemingly less relevant sites (as a whole) are the ones that are ranking in featured snippets. Do these sites provide the best answer? Well, the organic 10-blue-links ranking algorithm doesn’t think so, but the featured snippet algorithm does.

This means that the site has a lot of queries which have a low CTR, resulting in considerably less traffic coming through to the site. Sure, featured snippets sometimes don’t drive much traffic. But they certainly get a lot more attention than the organic listings below:

Based on the Nielsen Norman Group study, when SERP features (like featured snippets) were present on a SERP, they found that they received looks in 74% of cases (with a 95% confidence interval of 66–81%). This data clearly points to the fact that featured snippets are important for sites to rank within where possible, resulting in far greater visibility.

Because Google’s algorithm is making this decision, it's likely a liability thing; Google (the people involved with the search engine) don’t want to be the ones to have to make that call. It’s a tricky one. I understand why Google needs to put these systems in place for their search engine (scale is important), but communication could be drastically improved for these types of algorithmic interventions. Even if it isn’t a manual intervention, there ought to be some sort of notification within Google Search Console. Otherwise, site owners will just invest in R&D trying to get their site to rank within featured snippets (which is only natural).

And again, just because there are categories available in the featured snippet policy documentation, that doesn’t mean that the curiosity of site owners is always going to go away. There will always be the “what if?”

Deep down, I’m not so sure Google will ever make this addition to Google Search Console. It would mean too much communication on the matter, and could lead to unnecessary disputes with site owners who feel they’ve been wronged. Something needs to change, though. There needs to be less ambiguity for the average site owner who doesn’t know they can access awesome people from the Google Search team directly. But for the moment, it will remain Google’s dirty little featured snippet secret.






Tuesday, 28 January 2020

Google's January 2020 Core Update: Has the Dust Settled?

Posted by Dr-Pete

On January 13th, MozCast measured significant algorithm flux lasting about three days (the dotted line shows the 30-day average prior to the 13th, which is consistent with historical averages) ...

That same day, Google announced the release of a core update dubbed the January 2020 Core Update (in line with their recent naming conventions) ...

On January 16th, Google announced the update was "mostly done," aligning fairly well with the measured temperatures in the graph above. Temperatures settled down after the three-day spike ...

It appears that the dust has mostly settled on the January 2020 Core Update. Interpreting core updates can be challenging, but are there any takeaways we can gather from the data?

How does it compare to other updates?

How did the January 2020 Core Update stack up against recent core updates? The chart below shows the previous four named core updates, back to August 2018 (AKA "Medic") ...

While the January 2020 update wasn't on par with "Medic," it tracks closely to the previous three updates. Note that all of these updates are well above the MozCast average. While not all named updates are measurable, all of the recent core updates have generated substantial ranking flux.

Which verticals were hit hardest?

MozCast is split into 20 verticals, matching Google AdWords categories. It can be tough to interpret single-day movement across categories, since they naturally vary, but here's the data for the range of the update (January 14–16) for the seven categories that topped 100°F on January 14 ...

Health tops the list, consistent with anecdotal evidence from previous core updates. One consistent finding, broadly speaking, is that sites impacted by one core update seem more likely to be impacted by subsequent core updates.

Who won and who lost this time?

Winners/losers analyses can be dangerous, for a few reasons. First, they depend on your particular data set. Second, humans have a knack for seeing patterns that aren't there. It's easy to take a couple of data points and over-generalize. Third, there are many ways to measure changes over time.

We can't entirely fix the first problem — that's the nature of data analysis. For the second problem, we have to trust you, the reader. We can partially address the third problem by making sure we're looking at changes both in absolute and relative terms. For example, knowing a site gained 100% SERP share isn't very interesting if it went from one ranking in our data set to two. So, for both of the following charts, we'll restrict our analysis to subdomains that had at least 25 rankings across MozCast's 10,000 SERPs on January 14th. We'll also display the raw ranking counts for some added perspective.
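That filtering-and-sorting step is straightforward to reproduce. A sketch with made-up subdomain counts, where the 25-ranking floor and the two date columns mirror the methodology described above:

```python
# Hypothetical SERP-share counts per subdomain: (count on Jan 14, count on Jan 16).
counts = {
    "example-health.com": (40, 64),
    "tiny-site.net": (2, 6),        # below the floor; excluded from the analysis
    "travel-deals.org": (120, 84),
    "recipes.example.com": (30, 33),
}

MIN_RANKINGS = 25  # ignore subdomains with too few rankings to be meaningful

# Percentage change in ranking count over the update window, for qualifying sites.
changes = {
    domain: round(100.0 * (after - before) / before, 1)
    for domain, (before, after) in counts.items()
    if before >= MIN_RANKINGS
}

winners = sorted(changes.items(), key=lambda kv: kv[1], reverse=True)
print(winners)
```

Note how the floor keeps "tiny-site.net" out of the results: its 200% jump is just one ranking becoming three, which would otherwise dominate a percentage-sorted list.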

Here are the top 25 winners by % change over the 3 days of the update. The "Jan 14" and "Jan 16" columns represent the total count of rankings (i.e. SERP share) on those days ...

If you've read about previous core updates, you may see a couple of familiar subdomains, including VeryWellHealth.com and a couple of its cousins. Even at a glance, this list goes well beyond healthcare and represents a healthy mix of verticals and some major players, including Instagram and the Google Play store.

I hate to use the word "losers," and there's no way to tell why any given site gained or lost rankings during this time period (it may not be due to the core update), but I'll present the data as impartially as possible. Here are the 25 sites that lost the most rankings by percentage change ...

Orbitz took heavy losses in our data set, as did the phone number lookup site ZabaSearch. Interestingly, one of the Very Well family of sites (three of which were in our top 25 list) landed in the bottom 25. There are a handful of healthcare sites in the mix, including the reputable Cleveland Clinic (although this appears to be primarily a patient portal).

What can we do about any of this?

Google describes core updates as "significant, broad changes to our search algorithms and systems ... designed to ensure that overall, we’re delivering on our mission to present relevant and authoritative content to searchers." They're quick to say that a core update isn't a penalty and that "there’s nothing wrong with pages that may perform less well." Of course, that's cold comfort if your site was negatively impacted.

We know that content quality matters, but that's a vague concept that can be hard to pin down. If you've taken losses in a core update, it is worth assessing if your content is well matched to the needs of your visitors, including whether it's accurate, up to date, and generally written in a way that demonstrates expertise.

We also know that sites impacted by one core update seem to be more likely to see movement in subsequent core updates. So, if you've been hit in one of the core updates since "Medic," keep your eyes open. This is a work in progress, and Google is making adjustments as they go.

Ultimately, the impact of core updates gives us clues about Google's broader intent and how best to align with that intent. Look at sites that performed well and try to understand how they might be serving their core audiences. If you lost rankings, are they rankings that matter? Was your content really a match to the intent of those searchers?




