
From Page to Profit: Mastering SEO Indexation for E-commerce Success


How to optimise crawlability and indexation to produce more revenue for e-commerce brands.

Ryan Hassel

September 19, 2023


Transcript

  1. From Page to Profit:
    Mastering SEO Indexation
    for E-commerce Success
    Ryan Hassel
    Tug
    @tugagency


  2. Performance digital since 2006


  3. London | Berlin | New York | Toronto | Sydney | Singapore
    100+ digital experts globally, with local expertise
    $200M+ media managed internationally
    E-commerce experience since 2009


  4. What is Indexation?


  5. The Crawling, Indexation & Search Process
    To understand indexation, we must understand the whole journey:
    Googlebot Mobile crawls pages → newly crawled pages are indexed & added to Google's servers, if allowed.
    A user searches & the request is sent to Google's servers → algorithms act as a filter and surface relevant results from the index → relevant results are listed in the browser.


  6. Why is Indexation Important?


  7. Because it’s the Reason We’re All Here
    Indexation of e-commerce brand pages can lead to KPI success:
    More Organic Traffic
    Better Site Authority
    More Organic Revenue
    Better User Experience
    More SERP Presence
    Advantages Over Competitors


  8. When Managed Incorrectly, This Happens
    There is a correlation between decreasing revenue & increasing non-indexation


  9. Technical Issues Are The Main Culprits
    The following SEO issues hinder a website's crawlability and indexability:
    Pagination Issues
    Sitemap Issues
    Canonical Issues
    Redirect Issues
    4xx Errors
    Robots.txt Issues
    Robots Directive Issues
    Internal Link Issues


  10. How do we Optimise Indexation?


  11. Remove 404 Pages & Low Value OOS Pages
    This will save crawl budget & alleviate the issue of Google perceiving the website as low quality
    Remove “sold out” pages.
    Redirect broken pages to the most relevant page.
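    As an illustration, a permanent redirect from a retired out-of-stock URL to its closest live category, assuming a Next.js storefront (the paths are hypothetical placeholders):

      // next.config.js -- hypothetical paths; adjust to the site's own URL structure
      module.exports = {
        async redirects() {
          return [
            {
              source: '/products/discontinued-sneaker', // retired OOS product page
              destination: '/category/sneakers',        // most relevant live page
              permanent: true,                          // signals a permanent move to Google
            },
          ];
        },
      };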


  12. Create Optimised Internal Linking Strategy
    Internal linking passes ranking authority & aids crawlers


  13. Connect Orphaned Pages Back to Website
    Pages with no internal links are harder to find
    No links from main domain page(s) for crawlers to use to navigate to the page.


  14. Avoid Pagination & Facets Diluting Structure
    Avoid creating too many pages, which impacts crawl budget
    (Diagram: pagination controls — current page, user input, back to page 1, back 1 page, no. of articles/products)
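    One common mitigation (an illustrative sketch, not necessarily the approach used here) is to keep faceted parameter combinations out of the crawl so only the core category and simple pagination URLs consume crawl budget. The parameter names below are hypothetical:

      # robots.txt -- block crawl of hypothetical facet parameters
      User-agent: *
      Disallow: /*colour=
      Disallow: /*size=
      Disallow: /*sort=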


  15. Be Wary of Crawl Depth in Site Hierarchy
    Crawl depth > 4 can lead to lack of website page visibility
    (Diagram: site hierarchy from crawl depth 1 to crawl depth 5 — add category & product pages near the root domain)


  16. Create an Optimised and Concise Sitemap
    Only include important and working pages in sitemap(s)
    Do:
    ➔ Include important pages like category & product.
    ➔ Have < 50,000 URLs per sitemap.
    ➔ Include indexed pages that return a 200 status code.
    ➔ Group sitemap(s) by category, language, site area, etc.
    Don't:
    ➔ Include low value pages (T&C’s, etc.)
    ➔ Have > 50,000 URLs per sitemap
    ➔ Include non-indexed pages (e.g. canonicalised, redirected, noindex tag, etc.)
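    A minimal sketch of a concise sitemap, assuming a hypothetical category URL; only live, indexable pages that return a 200 status would be listed:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <!-- Hypothetical category page: indexable, returns 200, not canonicalised elsewhere -->
        <url>
          <loc>https://www.example.com/category/sneakers</loc>
          <lastmod>2023-09-01</lastmod>
        </url>
        <!-- Low value pages (T&Cs), redirected or noindexed URLs are deliberately left out -->
      </urlset>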


  17. Respect the Robots Directive Tags
    Address important product pages marked with incorrect tags

    Directives Key:
    Index - Allows the page to be indexed.
    Follow - Allows links on the page to be crawled.
    NoFollow - Tells crawlers not to crawl links on the page.
    NoIndex - Tells crawlers not to index the page.
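    For reference, this is how those directives typically appear in a page's HTML head (generic examples, not the client's actual markup):

      <!-- A product page that should rank: allow indexing and link crawling -->
      <meta name="robots" content="index, follow">

      <!-- A low value page, e.g. internal search results: keep it out of the index -->
      <meta name="robots" content="noindex, follow">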


  18. Remove Duplicate Content Using Canonicals
    Duplicate content impacts crawl budget & indexation
    URL: https://www.endclothing.com/gb
    Canonical: https://www.endclothing.com/gb
    URL: https://www.endclothing.com/gb/index.html
    Current Canonical: https://www.endclothing.com/gb/index.html
    New Canonical: https://www.endclothing.com/gb
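    The fix is applied in the duplicate page's HTML head, pointing the canonical at the preferred URL shown above:

      <!-- On https://www.endclothing.com/gb/index.html -->
      <link rel="canonical" href="https://www.endclothing.com/gb">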


  19. Tidy Your Robots.txt Instruction File
    Check we’re not disallowing any important pages or areas
    We’ve contradicted ourselves and confused search engines here by both allowing and disallowing crawling of any .php$ resources.
    Correct types of pages to disallow from Googlebot(s).
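    An illustrative sketch of that contradiction and one way to tidy it (the disallowed paths are hypothetical examples, not the site's actual rules):

      # Contradictory: the same pattern is both allowed and disallowed
      User-agent: *
      Allow: /*.php$
      Disallow: /*.php$

      # Clearer: only disallow areas that genuinely shouldn't be crawled
      User-agent: *
      Disallow: /checkout/
      Disallow: /cart/
      Sitemap: https://www.example.com/sitemap.xml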


  20. Make Your Temporary Redirects Permanent
    Temporary redirect pages are still used by Google for ranking, impacting indexation & ranking of new pages
    HTTP:// → HTTPS:// redirects:
    302 Temporary Redirect
    307 Temporary Redirect
    301 Permanent Redirect
    308 Permanent Redirect
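    As an illustration, a permanent HTTP-to-HTTPS redirect is usually configured at the web server or CDN; a generic nginx sketch (not the site's actual configuration):

      server {
          listen 80;
          server_name www.example.com;
          # 301 tells Google the move is permanent, so the HTTPS URL inherits ranking signals
          return 301 https://www.example.com$request_uri;
      }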


  21. Use CDN & CMS Auto Compression Features
    To combat Core Web Vital (CWV) issues such as LCP
    The banner is pulled from a CDN, which auto compresses and alleviates any rendering/speed/size issues.
    Product images are pulled from a different CDN, which doesn’t auto compress.
    Next.js (a React framework for creating web applications) isn’t auto compressing using gzip, brotli or deflate.
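    On the Next.js point, the framework's built-in server has a compress option that gzips rendered responses; a minimal sketch (if a CDN or reverse proxy already compresses, it can be handled there instead):

      // next.config.js -- make sure built-in gzip compression hasn't been switched off
      module.exports = {
        compress: true, // defaults to true on the Next.js server
      };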


  22. Improve Product Page Speed Further
    Removing unused JS will speed up page loading times
    No. of JavaScript Files (PDP): END. 103 | Farfetch 98 | USC 34 | Flannels 44
    No. of JavaScript Files (PLP): END. 92 | Farfetch 75 | USC 34 | Flannels 37


  23. Server Side Rendering the Navigation
    Will improve page speed and increase Google's chances of caching and crawling our navigation links
    Googlebot Mobile is struggling to access links in the menu.
    SSR will improve the chances of caching and speed up the process by which content is loaded/rendered.
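    A minimal sketch of a server-rendered navigation, assuming a Next.js App Router setup with hypothetical category paths; because the component renders on the server, the links exist as plain anchors in the initial HTML that Googlebot receives:

      // app/components/MainNav.tsx -- a React Server Component (no client-side JS needed for the links)
      import Link from 'next/link';

      const CATEGORIES = [
        { label: 'New Arrivals', href: '/new-arrivals' },
        { label: 'Footwear', href: '/footwear' },
        { label: 'Clothing', href: '/clothing' },
      ];

      export default function MainNav() {
        return (
          <nav>
            {CATEGORIES.map(({ label, href }) => (
              // <Link> renders a real <a href> in the server HTML, crawlable without JS
              <Link key={href} href={href}>
                {label}
              </Link>
            ))}
          </nav>
        );
      }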


  24. Render Important Page Resources First
    This will allow Google to quickly find important page content
    Above the fold images are rendered first.
    Tracking resources are loaded last, e.g. GTM.
    Sometimes this isn’t possible, so SSR would help preload resources above images (e.g. JS).
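    A generic sketch of that ordering in the document head; the image path is a placeholder and the GTM script is a simplified stand-in for the standard snippet:

      <head>
        <!-- Preload the above-the-fold hero image so it is fetched early (helps LCP) -->
        <link rel="preload" as="image" href="/images/hero-banner.jpg">

        <!-- Load tracking (e.g. GTM) asynchronously so it doesn't block page content -->
        <script async src="https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXXX"></script>
      </head>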


  25. Summarising Our Technical SEO Fixes
    Use CMS & CDN features to automate the optimisation on a sitewide scale:
    Redirect 404’s & Delete OOS & Unused JS
    Product Pages Near Root Domain
    Internal Link to Orphaned Pages
    Avoid Blocking Robots Tags
    Optimised Internal Linking Strategy
    Create Optimised & Concise Sitemap
    Avoid Temporary Redirects & Pagination
    Remove Duplicate Content
    Optimise Robots.txt
    Compress & SSR Elements


  26. How do we Check Indexation?


  27. We Have High Level & Evergreen Checks (1/2)
    Site searches can be used for indexation checks
    Shows us if the URL is indexed and in Google…
    Shows us which URLs on the domain are indexed and ranking for the term “Hello World”…
    Shows us if the page is indexed and in Google for the term “Hello World”…
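    The checks in the screenshots use Google's standard site: operator; roughly, the queries look like this (domain and term are placeholders):

      site:example.com/product-page                  -> is this exact URL indexed?
      site:example.com "Hello World"                 -> which URLs on the domain are indexed & ranking for the term?
      site:example.com/product-page "Hello World"    -> is this page indexed for the term?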


  28. We Have High Level & Evergreen Checks (2/2)
    GSC is better and will notify us about indexation status through URL inspection


  29. As Well As Prerequisite Checks (1/3)
    By checking Google’s cache, we identify how visible URLs are
    The navigation is heavily JavaScript-based and hasn’t been cached by Google. This makes navigation URLs less visible, impacting crawling & indexation.


  30. As Well As Prerequisite Checks (2/3)
    Large elements on a page can also reduce link visibility
    The banner image is too large & potentially blocking the visibility of other page elements.


  31. As Well As Prerequisite Checks (3/3)
    Utilise the crawlers available to identify blockers
    Crawler blockers:
    x-robots-tag: noindex, nofollow, nosnippet
    Redirects, Canonicals, Error Pages
    Crawling: Sitemaps, Robots.txt, Orphaned Pages, Crawl Depth, etc.
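    For a quick manual spot check of the header-based directive, the response headers can be inspected directly (placeholder URL):

      curl -sI https://www.example.com/product-page | grep -i x-robots-tag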


  32. Log File Analysis Shows Google Interaction
    Analysis using the SF (Screaming Frog) tool identifies how search engines engage with our domain's pages
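    Without a dedicated tool, a rough first pass over raw access logs is possible; a sketch assuming the common combined log format, where the request path is the seventh field:

      grep "Googlebot" access.log | awk '{print $7}' | sort | uniq -c | sort -rn | head -20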


  33. How Indexation Enhancements Benefited Our Clients


  34. Introduction to END. Clothing
    A little context around our client before we dive into results
    END. Clothing is a global fashion retailer selling designer brands
    END. Clothing is JavaScript heavy & struggles with being crawled
    Tug identified a crawling blocker as the reason for the lack of indexability


  35. Case Study: Increase in Facet Indexation
    UK facet page indexation increased by 64% MoM
    Validation passed: Google resolved the canonical issue
    There was a significant fix on 29th November, resolving a duplicate-content issue where a large volume of pages had no user-selected canonical.


  36. Case Study: Increase in Indexation After Temporary Redirect Fixes
    28% increase in events to pages between April & May 2023
    Site speed & CWV Fixes
    Temporary Redirect Fixes


  37. Case Study: Post Migration Technical Audit
    Technical fixes increased revenue by 118%
    Source: https://www.goinflow.com/blog/technical-seo-case-study/
    Robots.txt
    Disavow Fixes
    Sitemap Fixes/Creation
    Canonical Tag Fixes
    Applying Appropriate Robots Directives
    +118% Revenue
    +19% Users
    +19% Sessions


  38. 3 Key Points to Remember


  39. 3 Key Points to Remember
    Basic technical checks and fixes still yield positive results
    Use 3rd party tools to analyse Google interactions
    Employ platform features for sitewide changes


  40. Smart Combination of Data, Media,
    Content, and Technology
    Performance Digital
    Thank You!
