The technical SEO hierarchy of needs - Search Engine Land

What makes a site become the best site it can be? Healthy, functional sites that have reached their full SEO potential have been optimized based on market and keyword research, E-A-T, content relevance to search intent, backlink profiles, and more. But they all have one thing in common: their technical SEO needs are met.

Your site's technical SEO needs form a hierarchy. If needs lower in the hierarchy aren't met, needs on the next level are difficult to fulfill. Each level responds to a different requirement in the world of search engines: crawlability, indexability, accessibility, rankability, and clickability.

Understanding what each level of the pyramid involves helps make technical SEO look less intimidating without oversimplifying its role in making a website great.

The foundations of technical SEO: crawlability

At the foundation of the pyramid of technical SEO needs is a URL's crawlability.

Crawlability concerns a URL's ability to be discovered by search engine bots. URLs that are not crawlable might still be accessible to users navigating your website, but because they are invisible to bots, they can't appear in search results.

Crawlable URLs, therefore, are:

  • Known to search engines. Search engines discover URLs by crawling links and reading sitemaps.
  • Not forbidden to bots. Most search engine bots will respect meta robots instructions and directives in a robots.txt file that ask them not to crawl certain pages or directories.
  • Covered by the website's crawl budget. Less commonly, the "budget" accorded by Google's algorithms is spent on other parts of a site, causing delays or problems in getting a specific URL crawled.

The first step in a technical SEO audit, for example, is to uncover pages that can't be crawled, and why. Sometimes this is intentional, and sometimes it's an error and a quick win for SEO.
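
The "not forbidden to bots" criterion, for instance, can be checked programmatically. Below is a minimal sketch using Python's standard-library robotparser; the domain, paths and user-agent string are placeholders, not a recommended audit setup.

```python
# Minimal sketch: check whether a bot may crawl given URLs, per robots.txt.
# The example URLs and user-agent are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the robots.txt file

for url in [
    "https://www.example.com/blog/post-1",
    "https://www.example.com/private/report.html",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```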

Similarly, while crawl budget may seem esoteric and difficult to quantify, the basic principle is simple: when the cost of crawling is kept down and priority pages are surfaced first, more traffic can be gained through search engines. Technical SEO shapes how pages are discovered and prioritized to promote better crawling, and it leverages historical crawl frequency data and the situations that have provoked increased crawling in the past to improve current crawl rates.

Distribution of newly crawled pages by page group. The grey 'Other' category is the garbage category; a lot of crawl budget has been wasted crawling those pages. Source: OnCrawl.
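
Server logs are the most direct source for this kind of crawl budget analysis. The sketch below counts Googlebot hits per top-level directory from an access log; the file name and the simplified log pattern are assumptions to adapt to your own log format, and a production audit would also verify bot identity (for example via reverse DNS).

```python
# Hedged sketch: count Googlebot hits per top-level directory in an access log
# to see where crawl activity is being spent. Log path and pattern are assumptions.
import re
from collections import Counter
from urllib.parse import urlsplit

LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*".*Googlebot')

hits_per_section = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if not match:
            continue
        path = urlsplit(match.group("path")).path
        top = path.strip("/").split("/")[0]
        hits_per_section["/" + top if top else "/"] += 1

for section, hits in hits_per_section.most_common(10):
    print(f"{section}: {hits} Googlebot hits")
```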

Indexability

Just above crawlability in the hierarchy of technical SEO needs is indexability.

Indexable URLs are URLs that a search engine can include in a catalog of pages that are available to be presented in search results pages. Even when a URL has been crawled, various properties can prevent it from being added to the index.

In the most straightforward situations, pages can be prevented from being indexed by meta robots and robots.txt directives.

State of indexability by strategic groups of pages. Source: OnCrawl
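
A simple indexability spot check looks for a noindex signal in both the HTTP response headers and the HTML. The sketch below uses the third-party requests library; the URL is a placeholder.

```python
# Hedged sketch: detect a noindex directive in the X-Robots-Tag header
# or the meta robots tag of a page. The URL is a placeholder.
import re
import requests

def is_noindexed(url):
    response = requests.get(url, timeout=10)
    # Signal can come from the X-Robots-Tag response header...
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return True
    # ...or from the meta robots tag (naive scan; a real audit would parse the HTML)
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)',
        response.text,
        re.IGNORECASE,
    )
    return bool(meta and "noindex" in meta.group(1).lower())

print(is_noindexed("https://www.example.com/private/page.html"))
```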

But Google also chooses not to index pages when a more authoritative version exists for the same content. This is the case when a bot discovers the following elements:

  • Duplicate content.
  • Canonical declarations.
  • Alternate versions such as printable pages or mobile pages. (In the current move to a mobile-first index, mobile versions are indexed instead of desktop versions.)
  • Redirections.

To ensure that the right pages can be indexed, technical SEO verifies that these elements are correctly set up and that they apply to the correct pages.
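
As an illustration, the following sketch verifies that URLs intended for indexing are self-canonical rather than canonicalized to another page. It assumes the requests and beautifulsoup4 libraries are available; the URLs are placeholders.

```python
# Hedged sketch: extract each URL's rel="canonical" target and flag pages
# that canonicalize elsewhere. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

def canonical_target(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", attrs={"rel": "canonical"})
    return link.get("href") if link else None

# Placeholder URLs: the parameterized version should canonicalize to the first
for url in ["https://www.example.com/product-a",
            "https://www.example.com/product-a?ref=nav"]:
    target = canonical_target(url)
    status = "self-canonical" if target == url else f"canonicalized to {target}"
    print(f"{url}: {status}")
```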

Accessibility and website performance

An accessible URL is easy to display or render.

A URL that is both crawlable and indexable might still be inaccessible at the moment a search engine's bot attempts to crawl it. Pages and sites that rank but have persistent accessibility problems are often penalized in the search results.

Accessibility for bots — and for users — covers a broad range of related topics:

  • Server performance.
  • HTTP status.
  • Load time/page size.
  • JavaScript rendering.
  • Page depth in the site architecture.
  • Orphan pages.
  • Website resistance to spam and hacking.

The goal is to discover the threshold at which accessibility and performance metrics begin to hurt SEO performance, and to ensure that all pages of a website meet at least that minimum level. Technical SEO therefore uses tools to measure everything from server downtime and the HTTP status served to bots and users, to the size of the resources (CSS, JS, images…) transferred when a page is requested, to load time metrics such as TTFB, FCP, or TTLB.

Average response time between desktop and mobile bots and resources encountered. Source: OnCrawl
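
A rough version of such measurements can be scripted. The sketch below records the HTTP status, an approximate time to first byte, total download time and page weight for a handful of URLs; it uses the requests library, and the URL list and timing approach are illustrative rather than a substitute for real monitoring.

```python
# Hedged sketch: spot-check HTTP status, approximate TTFB/TTLB and page weight.
# URLs are placeholders; streaming lets us time headers and body separately.
import time
import requests

for url in ["https://www.example.com/", "https://www.example.com/category/widgets"]:
    start = time.monotonic()
    response = requests.get(url, timeout=30, stream=True)
    ttfb = time.monotonic() - start   # rough TTFB: time until headers arrive
    body = response.content           # downloads the body
    ttlb = time.monotonic() - start   # rough time to last byte
    print(f"{url}: status={response.status_code} "
          f"ttfb={ttfb:.3f}s ttlb={ttlb:.3f}s size={len(body)/1024:.1f} KB")
```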

When a technical SEO audit concludes that certain pages need more links, it is often working to eliminate underperforming orphan pages and URLs buried too deep in the site architecture. Some audits also cover accessibility for users: a page that does not work with a screen reader cannot be used by many visitors, no matter how great its content or keyword optimization.
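
Page depth and orphan pages can both be derived from an internal link graph. The sketch below runs a breadth-first search from the homepage over a hypothetical crawl export; any page that is never reached is an orphan.

```python
# Hedged sketch: compute click depth from the homepage and flag orphan pages.
# `link_graph` is made-up placeholder data standing in for a crawl export.
from collections import deque

# page -> set of pages it links to internally
link_graph = {
    "/": {"/blog/", "/products/"},
    "/blog/": {"/blog/post-1", "/"},
    "/products/": {"/products/widget"},
    "/blog/post-1": set(),
    "/products/widget": set(),
    "/old-landing-page": set(),   # no inlinks: an orphan page
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in link_graph.get(page, ()):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

orphans = set(link_graph) - set(depth)
print("Depth by page:", depth)
print("Orphan pages (unreachable from the homepage):", orphans)
```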

Once accessibility issues have been addressed, we can say that the basic technical SEO needs of a page are met. Without them, page and website SEO suffer. As we continue to move further up the hierarchy of needs, we pass from blocking factors to factors of improvement.

Rankability: the role of technical SEO in improving positions

Rankability is the first of the two top levels of the pyramid that deal with optimizations. Instead of forming the foundations of SEO, they are sometimes considered advanced technical SEO.

Clearly, crawlable, indexable and accessible URLs can rank. Some can even rank well. However, the average URL will rank better with a little help.

Using links to boost rankings

Linking, whether internal or external, transfers page importance (and traffic!) from popular pages to less popular pages, which benefit from the boost. Technical SEO strategies will therefore examine backlinks to determine the most advantageous profile, or use internal linking structures to promote pages.

Not only can internal links improve crawl rate (by reinforcing freshness when linking from new or updated content) and conversion (by funneling users towards high-converting and goal pages), but they also transfer page importance and help build content silos, two strategies for improving page rank.

Distribution of SEO traffic by number of inlinks per page. Beyond 50 inlinks, pages are more likely to be active. Source: OnCrawl
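
To see how internal links concentrate importance, you can compute a PageRank-style score over the internal link graph. The sketch below uses the third-party networkx library on a made-up edge list; a real analysis would use a full crawl export.

```python
# Hedged sketch: PageRank over a small internal link graph to estimate which
# pages accumulate the most internal popularity. Edge list is placeholder data.
import networkx as nx

edges = [
    ("/", "/guides/seo-basics"),
    ("/", "/products/"),
    ("/blog/post-1", "/guides/seo-basics"),
    ("/blog/post-2", "/guides/seo-basics"),
    ("/guides/seo-basics", "/products/"),
]

graph = nx.DiGraph(edges)
scores = nx.pagerank(graph)   # internal popularity estimate

for page, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{page}: {score:.3f}")
```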

Improving positions with semantic optimization

Content silos, created by interlinking semantically related content, help groups of pages rank better than a single page could. They build both depth and expertise while expanding keyword reach with pages that focus on long-tail keywords and semantically related concepts.

In some cases, it can also be worthwhile to look at the pertinence of a page with regard to the rest of the site, examine keyword density, number of words, text-to-code ratio, and other factors that can be either red flags or content quality indicators for a given keyword group.
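
Two of these indicators, word count and text-to-code ratio, are easy to compute. The sketch below assumes the requests and beautifulsoup4 libraries; the URL is a placeholder, and any thresholds applied to the results should come from your own benchmarks.

```python
# Hedged sketch: word count and text-to-code ratio for a single page.
# The URL is a placeholder; thresholds are deliberately left to the reader.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/guides/seo-basics"
html = requests.get(url, timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

word_count = len(text.split())
text_to_code = len(text) / max(len(html), 1)

print(f"words={word_count}, text-to-code ratio={text_to_code:.1%}")
```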

Clickability: the link between SEO and user behavior

The final level of technical SEO optimization concerns technical elements that make it more likely for a user to click on your results.

Because of how search engines present results, this can include earning coveted SERP locations outside of the normal organic results order and enriching your URL listings.

Content structure, such as lists, tables, and headings, helps search engines understand your content and facilitates the dynamic creation of featured results, carousels and more.

Similarly, formal structured data, including Schema.org markup, enhances search listings with rich elements such as the following (see the markup sketch after this list):

  • Breadcrumbs.
  • Star ratings.
  • Product information (price, stock…).
  • Event information (date, location…).
  • Recipe information (thumbnail, rating, preparation time, calories…).
  • Site links to key pages on the same site.
  • Site search from the SERP.
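
As a simple illustration of this kind of markup, the sketch below builds Schema.org Product markup as JSON-LD and wraps it in the script tag that would be embedded in the page's HTML; all field values are placeholders.

```python
# Hedged sketch: generate JSON-LD Product markup (Schema.org) for a page.
# All values are placeholders for illustration only.
import json

product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "image": "https://www.example.com/images/widget.jpg",
    "aggregateRating": {"@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "128"},
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

snippet = f'<script type="application/ld+json">{json.dumps(product_markup, indent=2)}</script>'
print(snippet)
```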

Likewise, videos and images with appropriate markup have an advantage in image and video search.

Relevance to search intent and content uniqueness draw users. While these remain abstract concepts, the technical tools to analyze and improve them are emerging. Techniques such as machine learning can be applied to search intent and user click behavior, while content creation aids such as AI are intended to facilitate the creation of new content.

Impact of rich data on CTR. Source: OnCrawl

In the meantime, technical SEO aims to use technical means to spot and signal potential discrepancies in search intent or duplicate content through similarity analysis.
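
The principle behind duplicate-content detection is text similarity. The sketch below compares two placeholder text extracts with Python's standard-library difflib; at scale, platforms typically rely on shingling or fingerprinting techniques instead.

```python
# Hedged sketch: near-duplicate detection by text similarity on two extracts.
# The two strings are placeholder content, not real page text.
from difflib import SequenceMatcher

page_a = "Our blue widget is durable, affordable and ships worldwide within two days."
page_b = "The blue widget is durable, affordable and ships worldwide in two days."

similarity = SequenceMatcher(None, page_a, page_b).ratio()
print(f"Similarity: {similarity:.0%}")   # a high ratio suggests near-duplicate content
```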

Finally, technical SEO analyzes user behavior data combined with website characteristics in order to discover correlations. The objective is to create more of the situations in which your website draws users. This strategy can uncover surprising correlations between page or website structure and user-based metrics like bounce rate, time on site or CTR.
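
In practice, this means joining crawl data with analytics or search data and looking for correlations. The sketch below uses the third-party pandas library on made-up placeholder data standing in for such a merged export.

```python
# Hedged sketch: correlate structural page features with user behavior metrics.
# The data frame is placeholder data standing in for a crawl + analytics join.
import pandas as pd

pages = pd.DataFrame({
    "depth":       [1, 2, 2, 3, 4, 5],
    "inlinks":     [120, 60, 45, 12, 5, 1],
    "word_count":  [1400, 950, 800, 600, 350, 200],
    "bounce_rate": [0.32, 0.41, 0.44, 0.58, 0.71, 0.83],
    "ctr":         [0.061, 0.045, 0.040, 0.021, 0.012, 0.004],
})

# Pearson correlation between structural features and user metrics
print(pages.corr().loc[["depth", "inlinks", "word_count"], ["bounce_rate", "ctr"]])
```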

Implementing technical improvements

You don't need a technical background to understand or to meet the critical needs at the bottom of the technical SEO hierarchy.

If there are issues that keep your site from being crawled, indexed, ranked, or clicked, SEO efforts in other areas won't be as effective. Spotting and resolving these issues is the role of technical SEO. Solutions like OnCrawl will help you understand where to start with actionable dashboards and reports combining your content, log files and search data at scale.

Distribution of pages in the site structure, crawled pages, ranking pages and active pages. Source: OnCrawl

Where does your site fall on the hierarchy of technical SEO needs?

About The Author

OnCrawl is an award-winning technical SEO platform that helps you make smarter SEO decisions. OnCrawl combines your content, log files and search data at scale so that you can open Google's blackbox and build a SEO strategy with confidence. Backed by a SEO crawler, a log analyzer and third-party integrations, OnCrawl currently works with over 800 clients in 66 countries including e-commerce websites, online publishers and agencies. OnCrawl produces actionable dashboards and reports to support your entire search engine optimization process and helps you improve your positions, traffic and revenues. Learn more about us at https://www.oncrawl.com.
