
8 Advanced SEO Skills & Where to Learn Them

Like many SEOs, it’s become a bit of a tradition for me to publish a reflective piece at the end of each year citing my “SEO predictions” for the coming year. Alas, since nothing is normal about 2020, I wanted to do something slightly different and instead of speaking directly to SEO trends, I’m going to share my recommendations for how you can amp up your SEO game in the coming year(s), and where you can go to learn these skills.

The thing I love most about the SEO field is how it’s constantly changing. This is pretty standard in the Tech world, but there’s something about SEO that constantly keeps you on your toes and forces you to grow, learn, adapt, and add new skills and tools to your repertoire each year.

Therefore, these suggestions are founded on my prediction that SEO will only continue to get more technical over time. This is only natural because SEO is essentially a game of outwitting Google and catering to the end-user, so as Google gets smarter and the end-user becomes more tech-savvy, so must we. 🙂

8 Advanced SEO Skills to Learn in 2021 and Beyond

  1. Python
  2. Data Analytics (custom reports, Google Tag Manager, GA, etc.)
  3. HTML & CSS
  4. Local SEO (+ how to work with local aggregators like Yext)
  5. Natural Language Processing (NLP)
  6. Web Entities
  7. International SEO (+ using HREFLANG tags)
  8. Log File Analyses

1. Python

Python is a dynamic programming language that can be used for many things, including application development, data analysis, and machine learning.

Because of its flexibility, its relative ease of use (even for beginners), and its processing speed, many SEOs have begun to use it to automate tasks such as:

  • Analyzing broken links on a site (learn how)
  • Writing image alt tags and captions (learn how)
  • Finding your BERT scores to show how well certain pages correlate with keyword meanings (learn how)
  • Identifying keyword cannibalization issues (learn how)
  • Creating an XML sitemap with up to 50,000 URLs (learn how)
  • Automating the process of categorizing keywords by user intent (learn how)

I believe we are just scratching the surface of how we can use Python to extract and analyze SEO-related data, so this is a skill that SEOs will need to have at least a basic understanding of moving forward.
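To make the sitemap task above concrete, here is a minimal sketch using only Python's standard library; the page URLs are made-up examples, and a real script would pull URLs from a crawl or database:

```python
# Minimal sketch: build an XML sitemap (up to 50,000 URLs per file,
# per the sitemaps.org protocol) using only the standard library.
# The URL list here is a made-up example.
import xml.etree.ElementTree as ET

def build_sitemap(urls, max_urls=50000):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls[:max_urls]:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

pages = ["https://example.com/", "https://example.com/blog/"]
print(build_sitemap(pages))
```

Write the returned string to `sitemap.xml` at your site root and reference it from robots.txt.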

To learn more, just Google “python for SEO” and enjoy nearly 40 million results! 🙂

2. Data Analytics

Being able to analyze your own data has always been important in SEO, and as reliance upon data continues to grow across all industries and companies, the ability to collect and analyze your own data is becoming even more critical.

Luckily, there are a number of easy ways that SEOs can learn more about data analytics and how to track, measure and report on important data points. Below are some options that are either free or very low-cost that will help you increase your savviness when it comes to data analytics:

  • Google Analytics trainings on Skillshop
  • IBM Data Analyst Professional Certification via Coursera
  • Google Tag Manager Fundamentals via Google
  • Learn about Google Data Studio via Google Support
  • Explore various Google Analytics courses on Udemy (these are paid, but most courses range from a reasonable $12.99 to $18.99)
  • Free data science courses via Coursera

I’d highly recommend at least brushing up on your Google Analytics and Google Tag Manager skills since these are free to learn and the tools are widely used.

3. HTML + CSS

Knowing some HTML and CSS is vital in today’s SEO climate. The reason for this is that as an SEO, you need to be able to communicate with developers who may also be assigned to the project and who can help implement more technical SEO recommendations.

Also, understanding some basic HTML and CSS can help you conduct more thorough website audits and identify code problems that may be hurting your site (like slow load times, duplicate content, canonicalization issues, “hidden” text, malformed URLs, tagging and analytics errors, etc.).

I can’t tell you how many times I have had to dig through the source code of a site in order to pinpoint where an SEO problem was coming from, and then be able to have an effective conversation with a developer on how to address the issue!  
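Some of that source-code digging can even be scripted. Here is a hypothetical sketch, using only Python's standard library, that flags one of the issues mentioned above (images missing alt text); the sample HTML is made up:

```python
# Sketch: scan an HTML document for images missing alt text, one of the
# code-level issues mentioned above. Uses only the standard library;
# the sample HTML is a made-up example.
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

html = '<img src="logo.png" alt="Company logo"><img src="hero.jpg">'
auditor = AltAuditor()
auditor.feed(html)
print(auditor.missing_alt)  # → ['hero.jpg']
```

The same parser class can be extended to check for missing canonical tags, duplicate titles, or noindex directives.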

I highly recommend exploring Codecademy and freeCodeCamp for excellent, free resources. You can also challenge yourself and get involved with something fun like #100DaysofCode.

4. Local SEO (+ how to work with local aggregators like Yext)

Local SEO is here to stay, in a big way. In fact, according to Ahrefs:
– 30% of all mobile searches are related to location
– 76% of people who search for something nearby on their smartphones visit a business within a day
– Certain “near me” searches (like “where to buy near me” and “store open near me”) grew more than 200% between 2017 and 2019

Other studies have shown that upwards of 72% of SERPs now have a local feature included on them, which is a huge number!

So, if you want to keep your SEO game strong, you also need to be well-versed in Local SEO. Part of this is understanding how the Local map pack works. Another part is understanding how local rankings are determined (hint: consistent NAP+W business information, i.e. Name, Address, Phone, Website, is a big factor). And a final part is being able to manage your listings effectively with platforms like Google My Business (helpful for small and medium-sized businesses) or aggregators like Yext and Moz Local, which are helpful for managing large, nationwide businesses with many locations.

A simple Google search for something like “how to learn local SEO” will yield many results and great resources, but here are a few that I have used to hone my skills:

– Moz’s Local SEO guides, found within its SEO Learning Center
– Ahrefs’ Guide to Local SEO
– Search Engine Journal’s Guide to Local SEO

5. Natural Language Processing (NLP)

Natural Language Processing has become even more important since Google introduced BERT in 2019, a pre-trained language model that helps machines understand words in context and extract nuanced meaning from them; in other words, it helps Google (and any other system that uses BERT) become smarter and better at interpreting search queries and other inputs.

The reason I’m including this in a post about advanced SEO tips is that, moving into 2021 and beyond, I think you have to understand what BERT is in order to continue being successful at SEO.

Sure, if you’ve already been creating amazing content and focusing on incorporating semantic keywords into your site and content, then maybe not much will change in your optimization efforts.

However, we have to be aware that, due to Google’s ability to discern the underlying meaning and intent behind a searcher’s behavior, a user searching for “leaf peeping trips northeast” may not be shown results that literally have “leaf peeping” in the title or even anywhere on the page – instead, they may see results that only mention “fall foliage” and “foliage maps.”

Why? Because Google isn’t taking things so literally anymore – it’s interpreting. It knows that if I search for “leaf peeping” it means I am interested in taking a trip to see the fall foliage, probably in the New England area.

To learn a little bit more about this phenomenon and tips for optimizing for NLP, check out Tip #1 in an article I wrote last year called “5 Tips to Help You Win at SEO in 2020.” I’d also recommend the two articles linked to below, or just doing a search for “what is bert seo” for more information.

https://moz.com/blog/what-is-bert
https://searchengineland.com/faq-all-about-the-bert-algorithm-in-google-search-324193

6. Web Entities & “Fraggles”

When I attended MozCon in 2019, Cindy Krum blew me away with her theory about “fraggles,” a word she uses to describe pieces of information held together by similar language and meaning. This isn’t a new concept or theory (I believe Cindy originally began talking about fraggles around 2017, and the concept of entities has been around even longer), yet I still think many SEOs don’t know about it or don’t care to apply it to their everyday practices. I also believe this concept is only going to become more relevant over time, so it’s better late than never to learn about it.

The idea relates to how Google, empowered by things like RankBrain and BERT, now ranks and indexes entities, not just websites, and how Google is really just in the business of giving people answers.

Perhaps unfortunately for us SEOs, these answers no longer have to come in the form of whole websites or web pages; Google is gathering small bits of information all the time which are then used to give people answers right there in the SERPs through People Also Ask accordions, featured snippets, Knowledge Graph results, etc.

Therefore, we now have to consider the relationship between all of our (or our client’s) pieces of information that are hanging around on the web (local listings, web pages, reviews, social media, articles, etc.), and how / where those pieces of information (“fraggles”) are showing up.

That means we can no longer focus only on optimizing the main website; we have to be diligent as SEOs about how we are creating content, organizing that content on web pages, and marking up content and pieces of information with schema and such.

To learn more about entities and fraggles, check out these resources:
– Straight from Cindy’s company, MobileMoxie: https://mobilemoxie.com/blog/what-the-heck-are-fraggles/
– Whiteboard Friday on the topic: https://moz.com/blog/fraggles
– Podcast episode on the topic: https://www.kevin-indig.com/podcast/the-future-of-entities-fraggles-and-api-indexing-w-cindy-krum/

7. International SEO

As the world continues to shrink into an ever more connected, borderless global economy, understanding international SEO is likely not something you’ll be able to avoid for long.

For some, international SEO may come easily, but for others who are used to working on more localized brands and websites (like myself, to be honest), it can be a little harder to get the hang of; after all, there are so many questions! How do you format your URLs? Should the sites in different languages live in subfolders on the main domain, function as subdomains, or be separate ccTLDs altogether – or a gTLD? What’s the best way to code and format language switchers for bots and users? Should you do language or country targeting?

As you can see, this can open up a whole can of worms and, depending on the maturity of your website or your client’s website, can be quite the headache. Luckily, there are some great resources out there to help you learn more about international SEO. I’ll link to a few below:

– My personal favorite resource: https://moz.com/learn/seo/international-seo
– A beast of an article but very helpful if you find yourself in the middle of an international SEO project or site migration: https://www.searchviu.com/en/cctlds-to-gtld
– https://www.semrush.com/blog/how-to-choose-the-right-international-seo-site-structure/
– This article obviously relies on SEMRush quite a bit, but I think it’s still worth a read even if you don’t have SEMRush because the author gives great tips and links out to many other good resources including an hreflang tag generator: https://www.semrush.com/blog/how-to-develop-your-international-seo-strategy/
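One concrete piece of the international SEO puzzle, the hreflang annotations mentioned above, can be sketched in a few lines. This is a hypothetical example (the domains and locales are made up); the key rule it encodes is that every language variant must list all variants, plus an x-default fallback:

```python
# Sketch: generate reciprocal hreflang <link> tags for the locale
# variants of one page. Domains and locales are made-up examples.
# Every variant must reference ALL variants, plus an x-default.
def hreflang_tags(path, locales, default_base):
    tags = []
    for code, base in locales.items():
        tags.append(f'<link rel="alternate" hreflang="{code}" href="{base}{path}" />')
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_base}{path}" />')
    return "\n".join(tags)

locales = {
    "en-us": "https://example.com",
    "en-gb": "https://example.co.uk",
    "de-de": "https://example.de",
}
print(hreflang_tags("/products/", locales, "https://example.com"))
```

The same tag set would go in the `<head>` of every variant of the page (or in the XML sitemap / HTTP headers, the two other supported placements).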

8. Log File Analyses

Log file analyses aren’t necessarily new, but they have become increasingly popular among technical SEOs over the past few years – and for good reason!

Log files offer another (and far more accurate) way to see how bots actually crawl your website and to gain valuable insights into things like crawl budget, error codes, and active vs. inactive pages. These insights are more accurate because they come straight from your own server rather than from a third-party tool that may be limited, and they can help you identify areas for improvement and optimization.
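At its core, a log file analysis is just parsing and counting. Here is a minimal sketch that tallies Googlebot hits by URL and status code; the log lines are made-up samples in the common Apache/Nginx format, and a real analysis should also verify Googlebot via reverse DNS, since the user-agent string alone can be spoofed:

```python
# Sketch: count Googlebot hits per (path, status) from access-log lines.
# The sample lines are made up; real logs come from your web server.
import re
from collections import Counter

LOG_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+)[^"]*" (?P<status>\d{3})')

def googlebot_hits(lines):
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # user-agent check only; verify via reverse DNS in practice
        m = LOG_RE.search(line)
        if m:
            hits[(m.group("path"), m.group("status"))] += 1
    return hits

sample = [
    '66.249.66.1 - - [01/Dec/2020] "GET /blog/ HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Dec/2020] "GET /old-page HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [01/Dec/2020] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))
```

From a tally like this you can spot crawl budget wasted on 404s or redirects, and pages Googlebot never visits at all.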

Below are some great resources for learning more about log files and how to perform a log file analysis. Happy logging!

https://builtvisible.com/log-file-analysis/
https://getstat.com/blog/uncovering-seo-opportunities-via-log-files/
https://www.screamingfrog.co.uk/log-file-analyser/
https://moz.com/blog/technical-seo-log-analysis

Honorable Mentions

These skills are definitely important, but I felt that they were either A) already understood to be important by most SEOs, or B) hard to cover in this blog post without it turning into one of Neil Patel’s infamous 10,000+ word posts 😉

My recommendation is to set aside some time to research these topics, understand what they are and what they mean, and – in the case of tangible items like video sitemaps, content audits and speed improvements – know how to actually do them. I can guarantee that if you do this, you will be leagues ahead of most SEOs and make yourself an invaluable part of any organization or client team that you work with.

  • PageSpeed Improvements
  • Video Sitemaps
  • Content Audits
  • JSON-LD (schema markup)
  • JavaScript + SEO Implications — I wrote a whole post on SEO considerations when thinking about adopting an SPA/JavaScript React framework if you’d like some light bedtime reading 😉
  • Advanced Tool Usage — learn how to use Screaming Frog, DeepCrawl, Ahrefs, etc to their full capabilities to get the most out of them and make your life a little easier
  • Customer Data Platforms (“CDPs”) — This isn’t directly related to SEO, but as someone who is responsible for bringing in traffic to a website and – hopefully – monitoring whether that traffic converts to a meaningful sale or inquiry, you should be familiar with these types of software because I believe they are only going to keep gaining in popularity

5 Tips to Help You Win at SEO in 2020

It’s hard to believe that we’re already an entire month into 2020. It seems like just yesterday we were all celebrating Y2K and then shortly thereafter optimizing websites with keywords in white text at the bottom of each page. 

Thankfully we’ve graduated from gaming the system and now get to focus on creating the highest quality experiences for our website visitors. 

So, what ranking factors are in store for SEO in 2020? Let’s dig in because there’s a lot to unpack. 

5 Important SEO Ranking Factors in 2020

#1: Focus on the user

With the introduction of BERT last October (2019), Google took a major step forward in AI and machine-learning capability, for search as well as other natural language processing tasks. BERT wasn’t a standard algorithm update, but it has completely upped the stakes when it comes to machines’ contextual understanding of language and conversational search.

What this means for SEOs is that Google is only going to continue getting smarter. It’s going to get better at understanding what users really mean when they’re searching for ambiguous queries like “bank” (financial institution) vs. “bank” (of a river) — or even more nuanced language like how the meaning of the word “get” or “run” can change drastically based on the words it’s surrounded by. 

This is how we speak as humans and now with BERT, machines (like Google) will be better able to interpret these nuanced phrases and anticipate the user’s intent.

This means that we as SEOs need to focus on: 

  • Anticipating the needs and intention of our target audience, and creating web pages and content that speak to their needs

  • Expanding our keyword research to not be so narrowly and literally centered on the product or service itself, but tapping into semantically related (so-called LSI) terms that indirectly relate to and surround that product or service.

    For example, if you sell dog food, don’t just target “dog food” — think what your users might be looking for or wanting and incorporate that as well, such as “how to transition my dog to a new dog food,” “is grain-free dog food really necessary,” or “best dog food to help my dog lose weight.” Then create an epic piece of content around those phrases, vs. just having a single product page.

    In other words, don’t be a salesperson, be a helper. Your SEO will reflect this positively.

  • Creating a great on-site user experience. Add real value to your user as they arrive on your site by creating content and an on-site experience that meets and anticipates their needs. Wow and inspire them vs. just selling to them.

    Using the dog food example again — why just have a product page when you could develop a quiz to help users determine which dog food formula would be best for their unique situation? Or provide funny “review” videos from dogs themselves demonstrating how much they love your product?

    User experience has been a critical part of SEO for several years now, but in 2020 it will likely matter more than ever.

#2: Yes, you still need to create content. But make it purposeful

Content in 2020 will still be important when it comes to SEO. However, I think we need to reframe why content is important. 

For years after we marketers learned that “content was king” and saw how it helped positively influence rankings, everyone jumped on board and began flooding the web with content. Nowadays there is an enormous number of blog posts, white papers, guides, infographics, etc. out there on nearly every topic you could imagine, and frankly it’s just hard to stand out unless you’re publishing 30,000-word guides and have a full production team behind you.

That’s why we need to reframe the role content plays in SEO. Walking around saying “content is king” won’t do it anymore — now it’s more like “User is king/queen” and content should just support that. 

If you’re sensing a trend you’re right, we’re right back to point #1 of focusing on the user. 

At the end of the day, yes — do your keyword research. Yes, create an outline so that your piece of content hits on some SEO-friendly topics. Yes, write your meta data and label your headings before publishing. 

Dot your basic SEO i’s, but to be honest, some of the best content out there is simply helpful. It’s not written from a sales perspective, it’s not touting your own product or service in every paragraph, and it’s not just chock-full of keywords. The best and most effective content, even from an SEO perspective, is the content that users want to read. It’s content that truly understands what’s on the user’s mind and either:

  •  answers a question
  •  solves a problem
  •  educates and/or enlightens
  •  delights and/or inspires

Keyword research can help identify some of these points and give you insight into what your audience is interested in, curious about, or having problems with, but creating purposeful content based on user interviews, customer service website chats and/or calls, or other user feedback is also a wonderful way to connect to your audience. 

#3: Make sure your website is technically sound

Technical SEO has long been a part of any SEO’s repertoire, but I think we need to become even smarter about this in 2020. Why is this so important? Yep, you guessed it: user experience. 🙂

If we want to provide a good user experience, we need to have a technically sound website. This is often one of the first things we work with clients on because I think it’s that important — it sets the stage and foundation for everything. 

Here are a few common things to watch out for when it comes to technical SEO:

  • Site speed on both mobile and desktop. I recommend using GTmetrix or Google’s PageSpeed Insights to test this and identify areas of opportunity for improvement
  • Broken links. SEMRush or Screaming Frog are great tools for finding broken links on your site, but you can also use a free tool like BrokenLinkCheck.com if you’re on a budget
  • URLs. Audit your URLs to check their structure (no special characters, underscores or stop words), and also see if there are areas where you can improve your folder structure. For instance, instead of having service pages that essentially stand on their own, consider whether it’s worth putting them in a subfolder like /services/service-name. This won’t always be the best answer because changing URLs can have big SEO implications, but I’ve seen great results from reorganizing URLs when it’s done correctly (i.e., don’t forget your 301 redirects).
  • XML sitemap and robots.txt file. You’d be surprised at just how many sites I run across that are not utilizing these properly. Always double-check them to ensure that you are telling Google which pages you want indexed and which should be ignored. It’s a simple check and usually a simple fix, but not something to overlook.
  • Structured data. It’s 2020 folks, which means you need to be using structured data (like Schema) on your website. Use this to mark up key information such as business info, contact info, location info and blog posts. This can do wonders for your SEO and help you stand out from a crowd of sites who may not yet be using this to their advantage. Also as a bonus: structured data can help make your site/page/post more eligible to show up for Google’s Featured Snippets, which we’ll talk about next.
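To show what the structured data point looks like in practice, here is a hypothetical sketch that emits a LocalBusiness JSON-LD block for a made-up business; in a real project you would validate the output with Google’s Rich Results Test before deploying:

```python
# Sketch: emit LocalBusiness JSON-LD (schema.org structured data)
# for a made-up business. Validate real output before deploying.
import json

business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Dog Food Co.",
    "url": "https://example.com",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Portland",
        "addressRegion": "OR",
        "postalCode": "97201",
    },
}
snippet = f'<script type="application/ld+json">{json.dumps(business, indent=2)}</script>'
print(snippet)
```

The resulting `<script>` tag goes in the page’s `<head>` (or anywhere in the body); Google reads it regardless of placement.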

#4: Go after Featured Snippets

Google’s Featured Snippets have been all over the SEO news world this past week due to a “deduplication” update released January 22, 2020. This update made it so that a website cannot occupy both the Featured Snippet (sometimes referred to as “position zero” because it appears at the top of the SERP) and a traditional blue link below it.

However, this shouldn’t discourage you from pursuing SERP Features / Featured Snippets — which include things like People Also Ask (PAAs) and Instant Answer Boxes — because they’re currently the best way to get exposure for your brand and website. 

This is especially true now that more than 50% of searches are zero-click searches (meaning they don’t result in a click through to a website), because people are getting the information they need straight from the SERP instead of going deeper into a website or article. 

So if you’re not showing up front-and-center for relevant queries, there’s a chance your website won’t actually get traffic even if you are technically ranking on page 1. 

I could go on for days about featured snippets (and often do with our clients), but we’ve still got one more factor to go so for now I’ll leave you with a great resource on how to optimize for featured snippets: https://moz.com/blog/optimize-featured-snippets

#5: Button-up your brand SEO

The last factor that I want to cover as part of this “SEO in 2020” post, is focusing on your Brand SEO. 

I’ve been doing SEO for 10+ years, so this factor was a little hard for me to wrap my SEO brain around; for most of my career we harped so much on non-branded SEO and showing up for non-branded terms, rather than just ranking for your brand.

However, due to significant changes within Google, the huge uptick in zero-click searches, and just all-around competitiveness within the SERPs for nearly all categories, it’s so important to own your brand within Google. 

What exactly does this look like? Let’s break it down real quick: 

  • Your brand’s Knowledge Graph. Whenever someone searches for your brand, you are eligible to show up in the Knowledge Graph panel on the right-hand side of the SERP. However, you need to optimize your Google My Business profile and — depending on the size of your business and brand — sources like Wikipedia as well. Ensuring that these are fully up-to-date and accurate will help influence the Knowledge Graph and present your business well to searchers.
  • Claim your local listings if you’re a brick-and-mortar business. By now we’re all familiar with Google’s local 3-pack (or “map pack”) for localized searches. If you have a brick-and-mortar business, you must find a way to own this space and the other local listings sites like Yelp. If you are an enterprise company looking to up your SEO game, I suggest you look into Yext’s local listings platform. If you’re a small-to-medium-sized business, Moz Local is a more affordable option that still helps you aggregate and update your local listings from a centralized platform.
  • Finally, this sounds obvious, but… make sure you are ranking for your brand name. I’m constantly shocked when I find a brand not actually ranking #1 for its own name! If this is the case, please follow steps 1-4, ensure your brand name is in your title tags and spelled out in your URLs (don’t abbreviate your brand name for your domain name), and then craft a strategy to really dominate the SERPs for your brand name. Because if you can’t, someone else will, and that’s not what you want when there’s already so much competition out there.

SEO Considerations & Implications of Adopting a SPA/JavaScript React Framework

Recently (as in, the past year or so) there has been a lot of discussion around the SEO implications of migrating a site to a JS framework. 

This is a really tricky topic to dissect, and I feel as if little was truly known about what would happen, as most clients/sites have been slow to adopt this process. Recently, though, we’ve begun to see actual case studies and results from these migrations, such as the results shared on Twitter by Pedro Dias:

So, I think the time has come to look JS in the eye and say, “Hey: why do you tank my SEO?” 

Grab a beer (or tea. or coffee.) because things are about to get technical. 

To understand why a JavaScript framework may not be great for SEO, we need to go back to the basics to remember how Google actually renders and retrieves information. 

In general, there are three main aspects to the search engine retrieval process: 

  1. Crawler (i.e. Googlebot)
  2. Indexer (for Google, this is called “Caffeine”)
  3. Query Engine (i.e. the platform itself, such as Google)

The Crawler’s job is to find all URLs on the web (or a particular website), and to crawl them. This is done by reading HTML, and by following any URLs found within the traditional <a href=” “> snippet. 

Once the Crawler has found its content, it sends that content to the Indexer, whose job it is to render the content at that particular URL (or set of URLs), and to make sense of the page(s). This process depends on many things such as page layout and PageRank (which Google does still use internally to determine a URL’s authority), as well as executing JavaScript.

This relationship can be cyclical, as the Crawler sends information to the Indexer, and then the Indexer may send information back to the Crawler as it discovers new URLs by rendering the page / executing JavaScript. The Indexer also helps the Crawler prioritize URLs based on what it determines to be high-value URLs. This affects how often the Crawler visits your website, and which pages it chooses to Crawl. 

So, they feed each other.

Crawling & Indexing JavaScript

When it comes to the question of whether Googlebot can crawl and index JavaScript, we have to keep in consideration the two separate processes of the Crawler and Indexer. 

At the end of the day, the short answer is that Google can and will crawl and index JavaScript pages. However, it is not as straightforward as that. 

To understand this further, we must separate the questions: 

  • Can Google crawl JavaScript? No.
  • Can Google index JavaScript? Yes. 

This is because Googlebot (i.e. the Crawler) can only really handle HTML and CSS – traditionally built pages and code. It must rely on Caffeine (i.e. the Indexer) to actually render the JavaScript before it can crawl your URLs and send them back to the Indexer to be prioritized and evaluated. 

This process is outlined in Google’s two-wave process for JS rendering and indexing, as shown below: 

(Image: Google’s two-wave process for JS rendering and indexing, via thesempost.com)

Overall, this makes the process of crawling and indexing a JavaScript site extremely inefficient and slow.

This is because on JavaScript sites (which use client-side rendering vs. server-side rendering), most (or all) internal links are not actually part of the HTML source code – what is handed to the Crawler initially is mostly a blank HTML document with a large JS bundle (which takes a long time to download).
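A quick way to see this for yourself is to count the crawlable links in the raw HTML a server returns, which is exactly the first-wave view Googlebot gets before any JavaScript runs. This sketch uses only Python's standard library, and the two sample documents are made up (a server-rendered page vs. a typical CSR shell):

```python
# Sketch: count crawlable <a href> links in raw HTML, i.e. what the
# Crawler sees before any JavaScript executes. Sample HTML is made up.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawlable_links(html):
    counter = LinkCounter()
    counter.feed(html)
    return counter.links

ssr_page = '<nav><a href="/blog/">Blog</a><a href="/about/">About</a></nav>'
csr_shell = '<div id="root"></div><script src="/static/app.js"></script>'
print(crawlable_links(ssr_page))   # → ['/blog/', '/about/']
print(crawlable_links(csr_shell))  # → [] — nothing for the Crawler to follow
```

Running the same check against your own site (curl the page, feed the response in) shows immediately whether your internal links survive the first wave.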

Possible Implications

So, looking at Google’s model, in CSR there’s nothing for Google to index in the source code during its first wave. And the second wave may occur hours or even a few weeks later, leaving you at risk for a partially indexed site.

The other risk factor is crawl budget. It’s not a new concept that Google does not have infinite patience; it only crawls a certain amount of content / number of URLs on any given site, because it does not have unlimited resources.

Since JavaScript websites add an extra layer of complexity to the process of crawling and indexing, there are inherent risks to your website, such as mismatched content priority (due to a perceived lack or surplus of internal links pointing to a particular URL), or again – a partially crawled and indexed website.

Also, this all applies only to Google, which luckily for us has the capability to do anything at all with JavaScript-based sites. Other search engines like Bing, Yahoo, DuckDuckGo, and even Baidu do not have the same capabilities, and in most cases it’s been found that JavaScript-based pages are not even indexed, because those search engines’ indexers are not as sophisticated or powerful (remember, rendering JavaScript requires much more processing power).

(Image: via moz.com)

So, if there’s any consideration at all for other search engines, know that your website could be at risk for real limitations in other networks. 

Exploring Solutions

In addition to Pedro’s tweet mentioned at the beginning of this article, there have been other studies that show damage to rankings and organic traffic for sites that switch over to SPAs / other JavaScript-based technologies (like on Hulu.com for instance), and there have been others that show significant improvements when a different approach was adopted, or when JavaScript reliance was dismissed. So, we have to be careful.

The good news is that there are two main solutions which seem to help mitigate the negative implications of migrating to a JavaScript based site:

  • Isomorphic JavaScript / isomorphic applications (sometimes called “universal applications”)
  • Pre-rendering

Isomorphic JavaScript is the solution actually recommended by Google. Here’s an explanation of both solutions though, from this article:

1. Pre-rendering: Essentially consists of listening and sending a pure HTML snapshot to the search engine bot when it requests your page. This ensures that the user can still enjoy the fast speeds provided by CSR, while also serving the search engines the HTML content needed to index and rank your pages.

2. Isomorphic JavaScript: Recommended by Google, this option consists of both client and search engines receiving a pre-rendered page of indexable HTML content at the initial load (essentially acting like SSR). All of the JS functionality is then layered on top of this to provide the fast client-side performance. It also works best for both users and search engine bots…

To add a bit more context:

Pre-rendering can be helpful, but there are some possible pitfalls, such as having to manage and maintain another piece of software on your server, and there can sometimes be compatibility issues that cause the HTML output to be incorrect. These issues do not happen to every site, and using reliable services like https://prerender.io/ can help; however, it is something to keep in mind.
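The core idea behind pre-rendering, "listen for a bot, serve an HTML snapshot," can be reduced to a user-agent check. This is an illustrative sketch only: the token list and response strings are made up, and real middleware (like prerender.io's) is considerably more robust:

```python
# Sketch: the "detect a bot, serve a snapshot" logic behind pre-rendering,
# reduced to a user-agent check. Tokens and responses are illustrative.
BOT_TOKENS = ("googlebot", "bingbot", "duckduckbot", "baiduspider", "yandex")

def response_for(user_agent):
    ua = user_agent.lower()
    if any(token in ua for token in BOT_TOKENS):
        return "prerendered HTML snapshot"   # full markup, ready to index
    return "CSR shell + JS bundle"           # fast client-side experience

print(response_for("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(response_for("Mozilla/5.0 (Windows NT 10.0) Chrome/87.0"))
```

In production this branch lives in your web server or framework middleware, and the snapshot is generated by a headless browser rather than stored by hand.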

Isomorphic applications are considered the best of both worlds – the crawlability and indexability of HTML, with the speed of JavaScript. They allow the Crawler to see the same output the browser sees, because the content is rendered and available when the search engine accesses the page. React is a framework that supports the isomorphic approach, so as long as your engineers have the skillset and bandwidth, I’d recommend this solution if possible. It is known to be the best for SEO purposes.

In general, even if the preferred isomorphic JavaScript approach can’t be fully implemented, I believe the best solution is a mix of SSR and CSR. This allows the initial HTML to be generated on the server while still providing an interactive experience to the user. In other words, ideally there would be some level of SSR involved here.

Why an SSR/CSR Hybrid Approach?

At the end of the day, the SEO game is about efficiency and readability. So it’s important to ensure that Googlebot and Caffeine (Google’s indexing system) can process your website efficiently, and that they can easily (and quickly) see the actual content, and related markup, on each page (through server-side rendering). 

This means taking steps so that imperative content is loaded and presented to Google within 5 seconds. This is why an approach like Isomorphic JavaScript, or at least Pre-rendering, can help preserve SEO while enabling the inherent benefits of SPAs like speed and flexibility. 

Another thing to consider when migrating your site to JS is to maintain unique, static URLs for your pages, rather than relying on hash-based (#) navigation that only updates the URL fragment when new content loads (instead of creating a new URL); the History API’s pushState method lets an SPA update to real URLs. Why? Because, as you know, it’s important for SEO that each page have its own “real” URL which can be indexed. This allows those pages to build up authority and gain backlinks, and gives them an opportunity to rank for certain topics. 

So, all links should still include the “href” attribute so that Google can pick them up (vs. relying solely on an onClick DOM event, which Google likely will not follow).
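As a quick way to audit this, you can parse the raw (unrendered) HTML and count the links a basic crawler could actually discover. A small sketch using Python’s standard library (the example markup is hypothetical):

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collect hrefs from <a> tags in the raw (unrendered) HTML --
    roughly what a non-rendering crawler can discover."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def crawlable_links(html: str) -> list:
    parser = HrefCollector()
    parser.feed(html)
    # Hash-only links lead nowhere for a crawler
    return [h for h in parser.hrefs if not h.startswith("#")]

html = '<a href="/blog">Blog</a><a href="#">JS nav</a><span onclick="go()">Hidden</span>'
print(crawlable_links(html))  # only /blog is discoverable
```

Note that the onClick `<span>` never appears in the output at all: without an href, that “link” simply doesn’t exist as far as a basic crawler is concerned.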

Other helpful tools for improving the SEO of an SPA/React Website include: 

  • React Router v4 (which will allow you to maintain an SEO-friendly URL structure for your website)
  • React Helmet (which will allow you to at least manage the metadata of a web document being served by React components. It’s described as “A document head manager for React.”)

If you’re interested in learning more and have a team of developers / programmers ready to jump in, this may also be a good resource for engineers as they seek to maximize performance and SEO through an SPA, as it outlines several technologies used in this hybrid approach: https://blog.digitalkwarts.com/server-side-rendering-with-reactjs-react-router-v4-react-helmet-and-css-modules/ 

QAing after you migrate to JS

If you’ve already migrated to JS or you’re about to, be sure to QA afterwards to ensure that your site can still be rendered and crawled. 

Going back to Pedro’s tweet, he left this suggestion to a question about knowing whether your JS is rendering properly: 

You can also check your outlinks to see how many show up with JavaScript enabled vs. disabled, as this indicates whether they can be detected by a basic crawler. Both Chrome and most crawling tools (like DeepCrawl or Screaming Frog) should allow you to crawl with or without JS, so this should come in handy when testing. 
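Once you have the two link sets (e.g. from one crawl with JS rendering enabled and one without), the comparison itself is simple. A sketch of that diff logic, with made-up example URLs:

```python
def compare_outlinks(raw_links, rendered_links):
    """Diff the links found in raw HTML vs. after JS rendering.
    Links present only after rendering are the ones a basic
    (non-rendering) crawler would miss."""
    raw, rendered = set(raw_links), set(rendered_links)
    return {
        "js_only": sorted(rendered - raw),        # at risk for basic crawlers
        "always_visible": sorted(raw & rendered), # safe either way
    }

report = compare_outlinks(
    raw_links=["/", "/about"],
    rendered_links=["/", "/about", "/blog", "/contact"],
)
print(report["js_only"])  # ['/blog', '/contact']
```

A large `js_only` list is the warning sign: those pages depend entirely on JavaScript execution to be discovered.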

So – what say ye? Do you have a love/hate relationship with JavaScript / React frameworks? Have you been impacted, or seen a site impacted, by a migration to JS? Share in the comments, or hit me up at @mhankins12 on Twitter! 

MozCon 2019 Recap: 10 SEO Tips Straight From The Experts

Ah, MozCon… a 3-day event where the coolest people in the world come together to talk about SEO, analytics, content marketing, and a little paid search. This year was especially exciting because I had the pleasure of going and representing Real Good Marketing, and got to meet my idol, Roger! 🙂

Yes I know folks, but this level of excitement is what happens when you work from home…

Anyway, in addition to hanging out with my industry friends and making a bunch of really bad search puns, I also learned a ton from some of the top talent in the digital marketing space. Since sharing is caring, I’m here to share my top 10 takeaways from MozCon 2019. 

Buckle your seat belts, because here we go.

1. People are still using Desktops

We’ve all heard about Google’s mobile-first index and heard the rumblings of how desktop is obsolete and mobile devices rule them all. However, according to the man himself, Rand Fishkin (AKA the founder of Moz)… actually, while mobile has overtaken desktop in terms of usage, desktop usage has… also stayed the same? Yes, that’s right. While we’re spending more time on our mobile devices because we’re constantly plugged into the matrix, we’re spending just as much time on desktops as we always have, and browser-based usage is at an all-time high. 

So despite mobile devices and despite the ominous rise of voice search, web search actually keeps growing. 

2. Zero-click searches are a big thing

In June 2019, for the first time the majority of searches resulted in zero clicks, meaning that folks are still searching just the same, but are leaving Google less often. 

In fact, according to Rand Fishkin (yes, again — he was dropping knowledge bombs like whoa), Google has sent ~20% fewer organic clicks via browser searches since 2016. To put that a different way, in 2016 there were 26 organic clicks for every 1 paid click; today there are 11 organic clicks for every 1 paid click. So Google is indeed cannibalizing websites’ clicks and traffic by presenting information within the search results themselves. 

This is why we’ve started focusing so much on SERP feature optimization at RGM and are redefining what constitutes “success” for SEO. Appearing in SERP features (like answer boxes, People Also Ask drop-downs, Knowledge Graph results, the Google Maps local pack, etc.) and “position zero” is going to become more and more impactful, and may be the new metric of success for organic search. 

3. E-A-T

Repeat after me: “E-A-T is not a ranking signal.” This is what Ruth Burr talked to us about, and in her own charming way she chided us for focusing too much on those three letters (as I’m doing right now, I suppose). 

If you’re not familiar, E-A-T stands for Expertise, Authoritativeness, and Trustworthiness, and a couple of years back it became something SEOs started paying a lot of attention to, since it was said to be the key to winning the search game. Basically, the idea is that if you can establish your business as an expert and authority in the field, and show Google that others trust you (through things like social followers and shares, backlinks, citations, etc.), you’ll become an organic search god, make millions of dollars, and retire by age 27. 

The thing Ruth brought up, though, is that E-A-T is not a ranking signal! Instead, it’s made up of dozens of different signals that may or may not be inputs to Google’s algorithm; they’re simply qualities outlined in Google’s guidelines for its human quality raters.

So to demonstrate E-A-T, you must continue to optimize your website and brand for both humans and machines: use clear, concise, accurate content; focus on web performance optimization (things we typically focus on here at RGM are page speed, crawlability, site architecture, etc.); and focus on what she calls “real company stuff.” Just follow the basic rules of marketing, connect with your audience, create the best content out there, and you will inherently build up that quality. Now that’s some solid advice. 

4. Localized features are everywhere

Rob Bucci wowed us when he said that a whopping 73% of the 1.2 million SERPs he analyzed had localized features on them, like the map pack, local carousels, etc. I believe it! We’ve seen the number of “near me” and localized search terms skyrocket for our clients over the past couple of years, and local SEO is becoming a bigger and more established practice. Interestingly, though, Rob explained how 25% of these listings had variability between markets (i.e. locations), with 85% being the maximum variability seen across zip codes WITHIN A SINGLE MARKET.

That was mind-blowing, because it means that if you’re in Chicago and you search “dentist in chicago” or “dentist near me,” you will likely see different results depending on exactly where you are in the city. But you’ll also see different results even if you’re not searching with local intent, depending on which market you’re searching from (like searching for “best advertising agencies” while in Chicago vs. sitting in NYC). 

So the era of personalized search continues, and yes, it is possible to be ranking #1 in one market; but if your boss searches the same term at 3am while sitting at a bar in Tokyo, no, you will probably not show up #1.

5. Keyword research is not the only way to generate content ideas

Ross Simmonds had some great tips about coming up with ideas for content that will resonate with your audience, and spoiler alert: it goes far beyond keyword research. 

He suggests expanding the content ideation process by tapping into communities and news sources to see what people are actually talking about, searching Quora or Reddit to see real topics and questions, and looking at a big brand or site in your industry to see what articles they’re writing and/or people are sharing from them. This helps you understand what people are actually interested in, vs. just throwing spaghetti at a wall and seeing what sticks. 

I’d also add to this that we love using tools like Answer the Public to generate ideas based on what people are curious about, using Google’s autosuggest drop-down to see popular searches and topics you may not be thinking about, and interviewing customer service reps at your company to see what they’re constantly asked about or helping folks with. All great ways to identify power content!

6. Build your brand, win local search

Mary Bowling enlightened us about how “Brand is king” and that if you want to win at the local search game, you’ve got to have a great brand that Google will reward. This is because she believes that Google ranks entities now (i.e. brands) not just websites. 

So how do you build your brand for local search? Well, first of all, you need to make sure that your core brand elements (like your main CTA, phone number, and services) are all over your metadata and your brand listings. 

You should also think about how you want to portray your brand and company on Google My Business and optimize as such, especially when it comes to your photos (i.e. do you want potential customers seeing Randy from Accounting winning the hot dog eating championship? Or do you want them to see pictures of your product or cool office space?). When choosing photos, think about what searchers actually want to see. Another spoiler alert: these are the photos that are relevant to your core business offerings, not Randy from Accounting.

7. Say hello to Fraggles.

To be honest, this was my favorite talk of the entire conference. Cindy Krum came up on stage and just killed it with her uniquely refreshing take on mobile-first indexing and rankings, and a concept she coined called “Fraggles” (fragment + handle — more on this in a minute). 

So the first thing to mention here is that, much like Mary Bowling’s talk, Cindy also backed up the idea that Google is indexing and ranking entities, not just websites — and that’s the name of the game when it comes to mobile-first indexing, as Google is serving answers, not websites. This includes the SERP features listed above, like People Also Ask accordions, featured snippets, Knowledge Graph results, etc. Cindy says that people want answers more than they want websites. So, we have to find a way to show up for answers, and not just try to rank for keywords and build website traffic. People’s behaviors are clearly changing as Google increases its in-SERP offerings, so we as marketers have to adapt.

So how do we do this? Well, she recommends taking advantage of “fraggles,” which are basically fragments of information held together by similar language and meaning. We have to create relationships between our content, our online properties, and our information that’s contained across various listings on the web. 

Overall Google wants to index more than just websites; they’re organizing the information of the world. So we have to organize our fraggles and keep up!

8. Amazing content is the only way to see benefit

Andy Crestodina is a brilliant business owner and an even more brilliant content marketer. He puts it simply by saying that digital marketing today is about building a bridge between Google and your thank-you page, and I couldn’t agree more. But how do we do this? 

Andy says that we can do this by producing killer content that’s original (like original research, original concepts and ideas, original data and diagrams, etc), and creating the best page on the internet for your topic. Ahh! So simple, yet so overlooked.

Along this line, he says that while good content is AMAZING, bad content is essentially worthless. Ouch. I guess that’s the point of having a great content strategy to guide your efforts vs. just posting to post. 

Another way to catch people’s eye is by producing collaborative content, which could include contributor quotes, expert round-ups, or deep-dive interviews. The great thing about this is that by including others, you’re more likely to get your content shared and linked to. Genius. 

9. Simple SEO truths

Rob Ousbey brought some simple yet effective truth bombs to the table, such as: 

  • Google has changed, but our processes (of optimizing sites) haven’t
  • Every now and then you have to re-challenge your beliefs about what works
  • The effects of changes are reversible, so don’t be afraid to test things
  • Google is not the most reliable source of information, so don’t be afraid to challenge them
  • The only thing that will work for your website is the thing that will work for YOUR website. No change has the same impact on two sites. 

Man, what a gem. If you’re looking for me, I’ll be out testing all the things and questioning The Google.

10. SERP Features are king

If you couldn’t tell, a common theme throughout the entire conference was talking about the prominence and importance of SERP Features. If you’re not familiar with SERP Features, here’s a screenshot (not from the conference) to give you an idea of what I’m talking about: 

And these are just a few! There are actually (at the time of writing this post) about 15 or 16 SERP features that now show up in Google’s search results, ranging from the items shown in the screenshot above, to images and videos, sitelinks, reviews, map packs, tweets, shopping results, and more. 

So it’s no surprise that MozCon host and SEO aficionado Britney Muller ended the conference on a high note as she shared some pretty eye-opening statistics about SERP features. Here are a few of them: 

  • Nearly 24% of all SERPs now have a featured snippet. That’s up 165% since 2016
  • Paragraphs and lists are the most popular by far, but 93% of featured snippet SERPs contain a PAA box
  • Featured snippet boxes have approximately a 115-character limit. So if you want that “position zero” featured answer box spot, keep your answers short, sweet, and to the point!
  • Monitor which keywords you rank for in featured snippets, because traffic and rankings may not matter as much as things like branding, messaging, and share of voice (sound familiar?).
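That ~115-character guideline is easy to build into a content workflow. A quick sketch of a length check for candidate snippet answers (the threshold is the approximate figure cited at the conference, not an official Google limit):

```python
SNIPPET_CHAR_LIMIT = 115  # approximate limit cited at MozCon, not official

def snippet_ready(answer: str, limit: int = SNIPPET_CHAR_LIMIT) -> bool:
    """Check whether an answer paragraph fits the featured-snippet box."""
    return len(answer.strip()) <= limit

answer = ("Position zero is the featured snippet Google displays "
          "above the first organic result on a search results page.")
print(snippet_ready(answer))  # True -- 110 characters, under the limit
```

Running every FAQ answer or definition paragraph through a check like this before publishing is a cheap way to keep your content eligible for the answer box.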

Whew — we made it, folks: my 10 biggest takeaways from MozCon. I hope you took something away from this post as well. 

If you want to discuss any of these changes or nerd out about SEO in general, feel free to drop me a line on Twitter @mhankins12, or shoot me an email. I’m always happy to strike up a conversation and talk digital marketing, or meet up at MozCon in years to come. 

‘Featured On’ Post: City vs. City SEO Throwdown

I recently became enamored with the travel / tourism industry after I had the opportunity to work on SEO for some clients like the City of San Jose, CA, and Miraval Life in Balance Spa. This is a fun and rewarding industry, as you get to help people discover new places and have new experiences.

That said, I decided to marry my love for SEO and the travel & tourism industry. In this “City vs. City Search Throwdown” post (it’s a contributed post for a fantastic local agency named Native Digital), I put 10 of the top U.S. travel destinations through the SEO wringer to find out which city comes out on top in terms of SEO performance.

So do yourself a favor: Sit back, have some fun, read the post, and let me know if you have any questions or comments.

May the best city win!

How to Prepare for Google’s Mobile-First Index

We’ve been anticipating this change for a while (and it still hasn’t happened yet), but Google’s mobile-first index keeps getting closer.

It’s speculated that the mobile-first index will be updated more regularly than the desktop index, and that desktop content and the desktop UX will no longer help you in the mobile world. This means that if you have an abbreviated version of your desktop website that shows to mobile users, you may be in trouble because Google will no longer reference your desktop content for context clues about your brand, content, or products and services.

How can we prepare for this update?

I began exploring this topic in a guest post that I contributed at the end of 2016, when Google first confirmed that a specific mobile index would be rolled out. Five months later, we’re all still left speculating what this might look like or how exactly it’ll impact our SEO.

Luckily, Gary Illyes has since offered us a few pointers on how to prepare for the change:

  1. Make sure your mobile site has the content you want to rank for. In other words, since Google will no longer consider your desktop-only content when deciding which sites show up in the mobile search results, you’ll need to make sure that all important content is present on both the desktop and mobile versions of your website (which shouldn’t be an issue if you’re using responsive web design).
  2. Make sure structured data is on your mobile site. Gary didn’t give specifics on what he meant by this, but I’d take an educated guess and say he’s referring to AMP markup, which is really driving the mobile content game nowadays.
  3. Make sure rel-annotations are on your mobile site. We aren’t quite sure what this one means yet either, but Gary assured us that more information would be provided in the months to come.
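Point 1 — content parity between desktop and mobile — is something you can spot-check yourself. Here’s a rough sketch that compares headings as a proxy for important content; in practice you’d fetch each version with the matching User-Agent (e.g. via urllib.request), which is stubbed out here with example markup:

```python
import re

def extract_headings(html: str) -> set:
    """Pull h1-h3 text as a rough proxy for 'important content'."""
    return {m.strip().lower()
            for m in re.findall(r"<h[1-3][^>]*>(.*?)</h[1-3]>", html, re.S)}

def parity_gaps(desktop_html: str, mobile_html: str) -> set:
    """Headings present on desktop but missing from mobile -- content a
    mobile-first index would no longer see."""
    return extract_headings(desktop_html) - extract_headings(mobile_html)

# Example markup standing in for fetched desktop/mobile responses
desktop = "<h1>Services</h1><h2>Pricing</h2><h2>Case Studies</h2>"
mobile = "<h1>Services</h1><h2>Pricing</h2>"
print(parity_gaps(desktop, mobile))  # {'case studies'}
```

A non-empty result means your mobile version is an abbreviated one — exactly the situation the paragraphs above warn about.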

 

Want to ensure you’re ready for the change? Give me a call and I’d be happy to take a look.

‘Featured On’ Post: How to Rank in Position Zero

Position zero has really made a name for itself over the past year or so as Google continues to expand its featured snippet capabilities.

What is position 0 in Google? Well, it’s this:

position zero seo google

The benefit? Well, clearly it’s that you’re outranking even the #1 position in terms of what the searcher sees first! This equates to more organic search traffic, higher organic CTR, and additional brand exposure and trust in general.

Pretty cool, huh?

But how do you take that coveted spot?

Read my contributed article to learn more about Featured Snippets and read step-by-step how you, too, can rank in position zero on Google.

Then, feel free to hit me up so we can work together on getting you there.

‘Featured On’ Post: Can Social Media Impact SEO?

I always get a mildly confused look when I tell a client that part of my SEO strategy includes social media.

Why is this?

Well, to many people’s surprise, the disciplines of SEO and Social Media are not as separate as one might think. In fact, Search Engine Land even has an entire section dedicated to Social in its 2017 Periodic Table of SEO Success Factors.

Social media is important to SEO because, ultimately, Google’s algorithm is driven by authority and trust. In other words, Google pays a great deal of attention to which brands and websites are getting positive attention, and which are not. And what better way to determine this than to turn to social media, where people are free to discuss, link to, and share your brand?

In that way, social indicators (like number of followers, engagement and activity with your brand, etc.) are loosely correlated with your SEO performance.

For a further analysis of whether or not social impacts your SEO, check out the blog post I wrote for MBB+, an advertising agency located here in Kansas City Suburbia. Then, feel free to contact me to set up a time to talk about how you can get your social (and SEO) on.

5 Local SEO Tips (and how to show up in the 3-pack)

Did you know that more than 80% of people use search engines to find local business information? And that 50% of those performing a local search actually visit the business within a day?

Well, it’s true!

As such, it’s important that you work as a business to put your location(s) on the map with local SEO.

As fate would have it, as Google becomes more advanced, so does the practice of local SEO. In my guest post for Tentacle Inbound, an inbound marketing firm, I walk through 5 simple local SEO tactics you can use as a business owner to help show up for those 80% of folks performing local searches.

But because I like you guys, I also want to give you some bonus information about how to show up in what’s called the “Local 3-pack.” This is important because nearly 45% of all clicks now go to businesses that show up in this coveted real estate.

 

How to Get Your Business in Google’s Local Three-Pack:

local 3 pack google listings

In essence, there are three main things Google looks at when determining which businesses to display for local search queries like the one shown above:

1) Relevance is how closely the business’s offerings match up with the search query. This is really driven by how your business listing is optimized (title, description, etc.) and the categories chosen to mark up your business, so be sure to perform local listing optimization to help out with this. Additionally, even the keywords on your website(s) count toward relevance, so optimizing your main .com is also important.
2) Distance is basically just proximity to the searcher, as this is obviously a determining factor of what shows up in Google Maps. There’s not much you can do to influence this factor, except make sure that your addresses are all up-to-date so that Google can see how close you are to the searcher.
3) Prominence is essentially Google’s term for “brand authority / trust.” Aside from relevance, this is the factor we can really impact with our SEO efforts, as prominence takes into account things like number of backlinks, ratings and reviews, quality and quantity of keywords that your brand is ranking for, domain authority, and website traffic volume. This calls for a well-rounded SEO strategy, which a consultant or agency can definitely help you with.
So in general, influencing those local search results seems to be a combination of local listing optimization and general website optimization.

‘Featured On’ Post: Understanding Google’s RankBrain

If you’re a small business and you’re interested in taking your SEO to the next level, then it’s time we talk about RankBrain.

What is RankBrain? Well, it’s Google’s machine learning algorithm, which keeps getting smarter and smarter by the day. Today, it’s estimated that about 15% of all queries are powered by the AI algorithm, which interprets a searcher’s language and intent to serve up the best organic results possible.

In my guest post for Tentacle Inbound, an inbound marketing agency, I dive deeper into what RankBrain is, why it matters, and how we can optimize for it.

Read the post to learn more about RankBrain.