Technical SEO Trends in 2022 and Beyond

Slides from my talk at inOrbit 2022 where I spoke about the current state of technical SEO and where I see things heading in the near future.

Barry Adams

May 17, 2022

Transcript

  1. Crawling: Googlebot
     • URL discovery;
       ➢ <a href> links in HTML
       ➢ XML sitemaps
       ➢ Other sources?
     • Crawl queue management;
       ➢ De-duplication based on URL patterns
       ➢ Crawl prioritisation & scheduling
     • Crawling;
       ➢ Fetching raw HTML
       ➢ Crawl ‘politeness’
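
To make the crawl-queue ideas above concrete, here is a minimal TypeScript sketch of de-duplication based on URL patterns and simple crawl prioritisation. The normalisation rules, tracking parameters and scoring are illustrative assumptions, not Googlebot's actual logic.

```typescript
// Toy crawl queue: de-duplicate discovered URLs by pattern, then prioritise.
// Normalisation rules and scoring below are illustrative assumptions only.
type QueuedUrl = { url: string; priority: number };

function normalise(rawUrl: string): string {
  const u = new URL(rawUrl);
  u.hash = "";
  // Drop parameters that create duplicate URL patterns.
  ["utm_source", "utm_medium", "utm_campaign", "sessionid"].forEach((p) =>
    u.searchParams.delete(p)
  );
  u.pathname = u.pathname.replace(/\/+$/, "") || "/";
  return u.toString();
}

function score(url: string, fromSitemap: boolean): number {
  // Shallower paths and sitemap-listed URLs get crawled first in this sketch.
  const depth = new URL(url).pathname.split("/").filter(Boolean).length;
  return (fromSitemap ? 10 : 0) - depth;
}

function buildCrawlQueue(
  discovered: { url: string; fromSitemap: boolean }[]
): QueuedUrl[] {
  const seen = new Map<string, QueuedUrl>();
  for (const d of discovered) {
    const key = normalise(d.url); // de-duplication based on URL patterns
    const candidate = { url: key, priority: score(key, d.fromSitemap) };
    const existing = seen.get(key);
    if (!existing || candidate.priority > existing.priority) seen.set(key, candidate);
  }
  // Crawl prioritisation & scheduling: highest priority first.
  return [...seen.values()].sort((a, b) => b.priority - a.priority);
}

console.log(
  buildCrawlQueue([
    { url: "https://www.example.com/page/?utm_source=news", fromSitemap: false },
    { url: "https://www.example.com/page/", fromSitemap: true },
  ])
); // one queue entry for https://www.example.com/page
```
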
  2. Don't use robots.txt to temporarily reallocate crawl budget for other pages; use robots.txt to block pages or resources that you don't want Google to crawl at all. Google won't shift this newly available crawl budget to other pages unless Google is already hitting your site's serving limit.
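
As a rough illustration of "block what you never want crawled", the sketch below hard-codes a hypothetical robots.txt and checks paths against its Disallow rules. The directory names are made up, and real robots.txt matching (wildcards, Allow precedence, per-user-agent groups) is more involved.

```typescript
// Hypothetical robots.txt: block sections you never want crawled at all,
// rather than toggling rules to temporarily "save" crawl budget.
const robotsTxt = `
User-agent: *
Disallow: /internal-search/
Disallow: /cart/
`;

// Naive Disallow check. Real robots.txt matching also handles wildcards,
// Allow precedence and per-user-agent groups.
function isDisallowed(path: string): boolean {
  return robotsTxt
    .split("\n")
    .filter((line) => line.startsWith("Disallow:"))
    .map((line) => line.replace("Disallow:", "").trim())
    .some((rule) => rule !== "" && path.startsWith(rule));
}

console.log(isDisallowed("/cart/checkout"));   // true  – never crawled
console.log(isDisallowed("/products/widget")); // false – crawlable
```
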
  3. Robots.txt prevents crawling… but not indexing!
     • Links on other webpages pointing to blocked URLs are still crawled
     • Their anchor texts carry relevancy for indexing
  4. Crawl Management
     • Canonicals & noindex are NOT crawl management;
       ➢ Google needs to see meta tags before it can act on them
       ➢ That means Googlebot still crawls those URLs
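
Since noindex only works when Googlebot can fetch the URL and see the directive, one option is to send it as an HTTP header instead of a meta tag. Below is a minimal fetch-style handler sketch, assuming a runtime with the standard Response API; the "noindex, follow" value is one common choice, not the only one.

```typescript
// Sending noindex as an HTTP header; equivalent to a robots meta tag in the
// <head>. Googlebot must still be able to crawl this URL to see the directive,
// so do NOT also block it in robots.txt. Generic fetch-style handler, not a
// specific framework.
function handleFacetedPage(body: string): Response {
  return new Response(body, {
    headers: {
      "content-type": "text/html; charset=utf-8",
      "X-Robots-Tag": "noindex, follow",
    },
  });
}
```
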
  5. Optimise Crawling
     • Serve correct HTTP status codes;
       ➢ 200 OK
       ➢ 301 / 302 Redirects
       ➢ 304 Not Modified
       ➢ 401 / 403 Permission Issues
       ➢ 404 / 410 Not Found / Gone
       ➢ 5xx Errors
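
A small audit script along these lines can surface status-code problems before Googlebot finds them. This sketch assumes Node 18+ with the global fetch API; the URLs are placeholders.

```typescript
// Status-code spot check. Assumes Node 18+ (global fetch); URLs are placeholders.
async function auditStatusCodes(urls: string[]): Promise<void> {
  for (const url of urls) {
    const res = await fetch(url, { redirect: "manual" });
    if (res.status === 200 || res.status === 304) continue; // healthy / cacheable
    if (res.status === 301 || res.status === 302) {
      console.log(`${url} redirects to ${res.headers.get("location")}`);
    } else if (res.status === 404 || res.status === 410) {
      console.log(`${url} is gone (${res.status}) – clean up internal links to it`);
    } else if (res.status === 401 || res.status === 403 || res.status >= 500) {
      console.log(`${url} returned ${res.status} – may slow or block crawling`);
    }
  }
}

auditStatusCodes([
  "https://www.example.com/",
  "https://www.example.com/old-page",
]).catch(console.error);
```
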
  6. Optimise Crawling
     • ALL resources consume crawl budget;
       ➢ Not just HTML pages
       ➢ Reduce HTTP requests per page
  7. Optimise Crawling
     • ALL resources consume crawl budget;
       ➢ Not just HTML pages
       ➢ Reduce HTTP requests per page
     • AdsBot can consume crawl budget;
       ➢ Double-check your Google Ads campaigns
  8. Optimise Crawling
     • ALL resources consume crawl budget;
       ➢ Not just HTML pages
       ➢ Reduce HTTP requests per page
     • AdsBot can consume crawl budget;
       ➢ Double-check your Google Ads campaigns
     • Link equity (PageRank) impacts crawl budget;
       ➢ More link equity = more crawl budget
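
One way to see where crawl budget actually goes is to tally server-log hits by bot and resource type. The sketch below assumes a simplified log format and naive user-agent matching; real logs and bot verification (e.g. reverse DNS) are messier.

```typescript
// Tally server-log hits by bot and resource type: every asset request and
// every AdsBot fetch consumes crawl budget, not just Googlebot HTML requests.
// Log format and user-agent matching are deliberately simplified.
const logLines = [
  '66.249.66.1 "GET /product/123 HTTP/1.1" 200 "Googlebot/2.1"',
  '66.249.66.1 "GET /assets/app.js HTTP/1.1" 200 "Googlebot/2.1"',
  '66.249.64.9 "GET /landing?gclid=abc HTTP/1.1" 200 "AdsBot-Google"',
];

function tallyBotHits(lines: string[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const line of lines) {
    const bot = line.includes("AdsBot-Google")
      ? "AdsBot"
      : line.includes("Googlebot")
        ? "Googlebot"
        : "other";
    const isAsset = /\.(js|css|png|jpg|svg|woff2?)\s/.test(line);
    const key = `${bot}:${isAsset ? "asset" : "html"}`;
    counts[key] = (counts[key] ?? 0) + 1;
  }
  return counts;
}

console.log(tallyBotHits(logLines));
// { "Googlebot:html": 1, "Googlebot:asset": 1, "AdsBot:html": 1 }
```
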
  9. ?

  10. Two Stages* of Indexing
      [Diagram: Crawler → Indexer → Ranker, stages 1 and 2]
      *At least – indexing is a collection of interconnected processes
  11. Indexing
      • HTML lexer;
        ➢ Cleaning & tokenising the HTML
      • Index selection;
        ➢ De-duping prior to indexing
      • Indexing;
        ➢ First-pass based on HTML
        ➢ Potential rendering (not guaranteed)
      • Index integrity;
        ➢ Canonicalisation & de-duplication
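
As a toy model of index selection, the sketch below strips markup, normalises the remaining text and fingerprints it so near-duplicate documents collapse to a single candidate. This is purely illustrative and not how Google's de-duplication works.

```typescript
// Toy "index selection": strip markup, normalise the visible text and
// fingerprint it, so near-identical documents collapse to one candidate.
function contentFingerprint(html: string): string {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<[^>]+>/g, " ") // crude lexing: drop tags, keep text
    .toLowerCase()
    .replace(/\s+/g, " ")
    .trim();
  let hash = 0; // tiny non-cryptographic hash, enough for a sketch
  for (const ch of text) hash = (hash * 31 + ch.charCodeAt(0)) | 0;
  return hash.toString(16);
}

function selectForIndex(pages: { url: string; html: string }[]): string[] {
  const byFingerprint = new Map<string, string>();
  for (const page of pages) {
    const fp = contentFingerprint(page.html);
    if (!byFingerprint.has(fp)) byFingerprint.set(fp, page.url); // keep one per group
  }
  return [...byFingerprint.values()];
}

console.log(
  selectForIndex([
    { url: "https://www.example.com/a", html: "<p>Same   content</p>" },
    { url: "https://www.example.com/a?ref=x", html: "<p>Same content</p>" },
  ])
); // [ "https://www.example.com/a" ]
```
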
  12. Rendering Issues
      • JavaScript inserts invalid HTML in the <head>;
        ➢ <body> tags in the <head> break Google’s processing of meta tags
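
A quick way to catch this class of problem is to inspect the rendered HTML's <head> for elements that only belong in the <body>. The regex heuristic below is an assumption-heavy shortcut; a proper HTML parser or a check against the rendered DOM is more reliable.

```typescript
// Heuristic check for body-only elements inside the rendered <head>.
// A regex is a rough shortcut; a real HTML parser or DOM check is more robust.
function findInvalidHeadElements(renderedHtml: string): string[] {
  const head = renderedHtml.match(/<head[^>]*>([\s\S]*?)<\/head>/i);
  if (!head) return ["no <head> found"];
  return head[1].match(/<(div|span|img|iframe|p|section)\b/gi) ?? [];
}

// Example: a script injects a <div> into the <head> at render time, which can
// make Google treat the head as closed and miss later meta tags / canonicals.
const rendered = `<head><title>Example</title><div class="cookie-banner"></div>
<link rel="canonical" href="https://www.example.com/"></head><body></body>`;
console.log(findInvalidHeadElements(rendered)); // [ "<div" ]
```
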
  13. Optimise Indexing
      • Don’t rely on Google’s rendering;
        ➢ Use SSR & CDN caching
      • Minimise page weight;
        ➢ Fewer page resources = better use of crawl budget, faster load speed & CWV, less chance of rendering issues
      • Improve internal linking;
        ➢ More PageRank = higher chance of indexing
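
A simple way to verify you are not relying on Google's rendering is to fetch the raw HTML, with no JavaScript execution, and confirm the content you need indexed is already present. The URL and phrase below are placeholders; the sketch assumes Node 18+ with the global fetch API.

```typescript
// Check that important content is present in the raw, unrendered HTML.
// Assumes Node 18+ (global fetch); URL and phrase are placeholders.
async function contentInRawHtml(url: string, mustContain: string): Promise<boolean> {
  const res = await fetch(url, { headers: { "user-agent": "seo-audit-sketch" } });
  const html = await res.text();
  return html.toLowerCase().includes(mustContain.toLowerCase());
}

contentInRawHtml("https://www.example.com/article", "key product name")
  .then((ok) => console.log(ok ? "present in raw HTML" : "only added by JavaScript?"))
  .catch(console.error);
```
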
  14. Edge SEO
      • CDNs store cached versions of your webpages;
        ➢ Global coverage with edge nodes worldwide
        ➢ Usually also results in faster crawling and better CWV
      • You manipulate your CDN-cached pages;
        ➢ Cloud Workers enable a range of functionality
      • Googlebot crawls the changed CDN-cached pages;
        ➢ Your ‘original’ website remains unchanged
        ➢ Google only sees the changed CDN webpages
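
Below is a minimal sketch of what such an edge change can look like, written in the style of a Cloudflare Worker with its HTMLRewriter API; the selectors and replacement values are hypothetical, and other CDNs offer comparable edge-function mechanisms.

```typescript
// Edge-SEO sketch in the style of a Cloudflare Worker: rewrite the CDN-served
// HTML before Googlebot sees it, while the origin site stays unchanged.
// Assumes the Workers runtime (HTMLRewriter global); selectors/values are examples.
export default {
  async fetch(request: Request): Promise<Response> {
    const originResponse = await fetch(request); // the cached origin page

    return new HTMLRewriter()
      .on("title", {
        element(el) {
          el.setInnerContent("Title rewritten at the edge");
        },
      })
      .on('link[rel="canonical"]', {
        element(el) {
          el.setAttribute("href", "https://www.example.com/preferred-url/");
        },
      })
      .transform(originResponse);
  },
};
```
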
  15. Why Edge SEO?
      • Faster deployment;
        ➢ Bypass your developers’ lengthy queues
        ➢ ‘Ask forgiveness, not permission’
      • No CMS constraints;
        ➢ Change pages directly regardless of your CMS capabilities
      • Testing;
        ➢ Perform narrow tests on specific site sections
        ➢ A/B testing for SEO
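
For SEO A/B tests at the edge, variants are typically assigned per URL rather than per visitor, so Googlebot always sees a consistent version of each page. A deterministic bucketing sketch, with an illustrative 50/50 split:

```typescript
// Deterministic per-URL bucketing for SEO A/B tests: a given URL always falls
// in the same group, so Googlebot sees a consistent variant. The 50/50 split
// is an illustrative choice.
function seoTestGroup(pathname: string): "test" | "control" {
  let hash = 0;
  for (const ch of pathname) hash = (hash * 31 + ch.charCodeAt(0)) | 0;
  return Math.abs(hash) % 2 === 0 ? "test" : "control";
}

console.log(seoTestGroup("/category/widgets/")); // stable for this URL
```
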
  16. Barry Adams
      ➢ Doing SEO since 1998
      ➢ Specialist in Technical SEO & News SEO
      ➢ Newsletter: SEOforGoogleNews.com