SEO in the Age of JavaScript: Making SPAs Search-Friendly

Modern frameworks like React and Angular power today’s most dynamic digital experiences—but they also introduce serious SEO pitfalls if not handled correctly. In this in-depth talk, Panos Kondylis (Co-Founder of Fussion) demystifies the SEO challenges posed by Single Page Applications (SPAs) and JavaScript-heavy websites.

Learn why client-side rendering breaks indexing, what Google really sees during the rendering phase, and how to solve visibility issues through SSR, SSG, hybrid rendering, and intelligent prerendering strategies. Backed by real-world case studies, Panos shares actionable insights on recovering lost rankings, improving crawlability, and aligning developers and SEOs for long-term success.

Whether you're managing a migration or scaling an eCommerce platform, this talk is a must-watch for anyone navigating SEO in the JavaScript era.

Athens SEO

May 30, 2025

Transcript

  1. SEO in the Age of JavaScript: Making SPAs Search-Friendly

    Panos Kondylis, Co-Founder, Fussion
    24 May 2025
    speakerdeck.com/pkondylis · linkedin.com/in/kondilis/ · panosko
  2. About Me

    Panos Kondylis, Co-Founder of Fussion
    22+ years of experience in DM & SEO
    Married, father of … Love for …, … & SEO
    “The Wire” is the best TV ever made
  3. About Fussion

    We Engineer Growth. Not Just Marketing. Fussion is your performance partner, scaling your brand with data, UX, and relentless CRO.
    • Built for Performance-Driven Brands
    • Born from Digital Growth Veterans
    • Clients from eCommerce, SaaS, Media & Marketplaces worldwide
    • $24M+ in Ad Spend Managed Across Channels
    • Average 80% ROAS Increase in 12 Months
  4. What We’ll Discuss Today

    01 Why SPAs break SEO
    02 How to fix it (properly)
    03 Real-world results
  5. SPAs: Fast UX, Poor Crawlability

    Great for users: smooth transitions, fast interactions, app-like feel.
    Bad for search engines: no HTML content, no real links, no metadata.
  6. Why SPAs are an SEO nightmare

    No content in the initial HTML. Googlebot lands on an empty shell — nothing to index. (Slide contrasts what bots see with what users see.)
  7. Rendering & Indexing Takes More Time

    Google doesn’t “see” JavaScript content instantly. The reality of JS SEO:
    • Crawl: Googlebot sees only the raw HTML
    • Render: happens later — when resources allow
    • Index: JS content joins the index in a second wave
    “The idea of a ‘second wave of indexing’ isn’t a theory. It’s how it works.”
  8. Google acknowledges it too

    “Google processes JavaScript web apps in three main phases: Crawling, Rendering, Indexing.”
    (Source: Google Search Central – JavaScript SEO Basics)
  9. Metadata loads too late (or never)

    Meta titles & descriptions are injected via JavaScript after the crawl. But in reality, this might not happen! Yes, a generic, nonsensical meta title can appear in Google’s index. Not easy to rank for “Laptops for Gamers” that way, right?
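The fix the slide implies is to emit title, description, and canonical in the initial server response rather than injecting them client-side. A minimal Node.js sketch of that idea; the `renderHead` helper and the `page` fields are hypothetical, not part of the talk:

```javascript
// Sketch: emit SEO-critical tags in the initial HTML on the server,
// instead of injecting them with client-side JavaScript after the crawl.
// Field names (title, description, canonicalUrl) are illustrative.
function renderHead(page) {
  return [
    `<title>${page.title}</title>`,
    `<meta name="description" content="${page.description}">`,
    `<link rel="canonical" href="${page.canonicalUrl}">`,
  ].join('\n');
}

const head = renderHead({
  title: 'Laptops for Gamers | Example Shop',
  description: 'Browse gaming laptops with fast GPUs and high-refresh displays.',
  canonicalUrl: 'https://example.com/laptops/gaming',
});
console.log(head);
```

Because these tags are present in the raw HTML, they survive the crawl phase and do not depend on the later rendering wave.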
  10. This is disappointing, right?

    • Irrelevant or outdated titles in SERPs
    • No control over how your brand is presented
    • Lost clicks, even if you’re ranking
  11. Canonical Tags are not always respected

    This is a category URL (a PLP), but Google selected a totally different taxonomy (!!): a product page from a totally different category. This PLP will never get indexed.
  12. Discovery of new links is difficult

    Googlebot will take too much time to discover or revisit our URLs.
  13. Indexation is a disaster

    • Only ~10% of pages indexed.
    • The rest? Lost visibility.
    • “Crawled – not indexed” = thin content or rendering issues.
  14. SPAs Often Fail at HTTP Status Codes

    Many SPAs show a “Not Found” page in the browser but still return an HTTP 200 status code — which confuses search engines and leads to soft 404s. (Slide contrasts a proper implementation with a wrong one.)
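The proper implementation means deciding the status code on the server, before the SPA shell is sent. A minimal sketch, assuming a known-routes table (`KNOWN_ROUTES` and `resolveRoute` are illustrative names, not from the talk):

```javascript
// Sketch: decide the HTTP status server-side before sending the shell.
// A SPA that always returns 200 and renders "Not Found" in the browser
// produces soft 404s; the server should send a real 404 instead.
const KNOWN_ROUTES = new Set(['/', '/laptops', '/laptops/gaming']);

function resolveRoute(path) {
  if (KNOWN_ROUTES.has(path)) {
    return { status: 200, body: '<h1>Page content…</h1>' };
  }
  // Real 404: crawlers drop the URL instead of indexing an empty shell.
  return { status: 404, body: '<h1>Not Found</h1>' };
}

console.log(resolveRoute('/laptops').status);        // 200
console.log(resolveRoute('/does-not-exist').status); // 404
```

In a real app the routing table would come from the framework or CMS, but the principle is the same: the status code and the visible page must agree.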
  15. Too big to fail? A 14% loss in top-20 rankings

    With no prerender and no SSR, rankings can recover — but not fully, and it will take months.
  16. JavaScript-Heavy Pages Hurt SEO (until You Fix Rendering)

    Large eCommerce site where Product & Category pages relied on client-side JS: 35% growth after moving critical content to the initial HTML — no JS used to deliver content.
  17. Rendering Options for SEO

    • CSR (Client-Side Rendering): content loads via JavaScript after the initial page load. SEO friendliness: Weak. Best use case: web apps prioritizing interactivity.
    • SSR (Server-Side Rendering): content is rendered on the server and sent as full HTML. SEO friendliness: Strong. Best use case: dynamic websites needing real-time updates.
    • Prerendering: static HTML snapshots generated for bots. SEO friendliness: Strong. Best use case: SPAs with stable content.
    • SSG (Static Site Generation): HTML is built at build time and served instantly. SEO friendliness: Strong. Best use case: blogs, docs, marketing sites.
    • Hybrid / Edge Rendering: combines SSR, SSG, and edge caching for optimized delivery. SEO friendliness: Strong. Best use case: complex apps needing flexibility & speed.
  18. SSR: Real-Time HTML That Bots Love

    SSR renders the full HTML on the server before it's sent to the user. Googlebot receives a complete, indexable page — instantly.
    • HTML generated per request on the server.
    • Crawlers see everything immediately.
    • Ideal for dynamic content (e.g., eCommerce, SaaS).
    • Needs Node.js or a similar environment (e.g., Next.js, Nuxt).
    (Diagram: request page → SSR server generates HTML at request time → rendered HTML returned.)
  19. Server Side Rendering (SSR)

    Pros:
    • Content available in initial HTML
    • Ideal for dynamic content (e.g., eCommerce)
    • Compatible with metadata & schema
    • Googlebot sees full content immediately
    Cons:
    • Slower time to first byte (TTFB)
    • Requires server infrastructure
    • More complex dev setup
    • Can strain servers under high traffic
  20. Static Site Generation (SSG)

    With SSG, every page is rendered at build time, producing ultra-fast, crawlable HTML — with near-zero server load.
    • Pages are rendered during deployment — not on request.
    • Content is served from a CDN (extremely fast).
    • Ideal for content-heavy or blog-style sites.
    • Clean source HTML, instant rendering.
  21. Static Site Generation (SSG)

    Pros:
    • Super fast page loads
    • Fully crawlable HTML
    • No server required (CDN friendly)
    • Great for blogs, docs, landing pages
    Cons:
    • Rebuilds needed for content updates
    • Not ideal for real-time data or user auth
    • More dev overhead for content-heavy sites
    • Can’t personalize easily per user
  22. SSG vs SSR — Key Differences

    • When HTML is generated: SSG at build time (before request); SSR at request time (on demand).
    • Where it's generated: SSG once, during deployment; SSR on the server, per request.
    • Performance: SSG blazing fast (CDN-served static files); SSR fast, but slower TTFB (server computes HTML).
    • SEO friendliness: SSG excellent (fully rendered HTML); SSR excellent (if fast enough).
    • Content freshness: SSG needs a rebuild for updates; SSR always up to date.
    • Ideal use cases: SSG for blogs, docs, marketing sites; SSR for dynamic content, user-specific pages, eCommerce.
    • Scalability: SSG highly scalable (served from CDN); SSR scales with infra effort (server load).
  23. Hybrid / Edge Rendering

    Hybrid frameworks allow you to mix SSG, SSR, and edge caching — optimizing both SEO and user experience at scale.
    • Per-route or per-component rendering control.
    • SSG for stable pages, SSR for dynamic ones, edge caching for global delivery.
    • Personalization & SEO in one framework.
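Per-route rendering control can be pictured as a routing table that maps URL patterns to rendering modes. A hypothetical sketch (the route patterns, mode names, and TTL value are invented; real frameworks such as Next.js express this through their own route-level configuration):

```javascript
// Hybrid-rendering sketch: a per-route table decides whether a path is
// served as a prebuilt static file (SSG), rendered per request (SSR),
// or cached at the edge with a TTL. All values are illustrative.
const routeConfig = [
  { pattern: /^\/blog\//,     mode: 'ssg' },             // stable content
  { pattern: /^\/cart/,       mode: 'ssr' },             // user-specific
  { pattern: /^\/products\//, mode: 'edge', ttl: 300 },  // cached 5 min
];

function renderingModeFor(path) {
  const route = routeConfig.find((r) => r.pattern.test(path));
  return route ? route.mode : 'ssr'; // safe default for unknown paths
}

console.log(renderingModeFor('/blog/js-seo')); // ssg
console.log(renderingModeFor('/products/42')); // edge
console.log(renderingModeFor('/cart'));        // ssr
```

The point of the sketch: the decision is made route by route, so SEO-critical pages can get fully rendered HTML while interactive pages keep their dynamic behavior.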
  24. Hybrid / Edge Rendering

    Pros:
    • Best of SSR + SSG + CDN
    • Smart caching at the edge improves speed
    • Dynamic rendering without SEO loss
    • Works great for multi-region delivery
    Cons:
    • Can increase build & infra complexity
    • Requires modern frameworks (e.g., Next.js, Nuxt)
    • DevOps complexity
    • Higher technical barrier
  25. Prerendering

    Serves pre-built (cached) HTML snapshots to bots.
    • A bot-detection layer identifies Googlebot and sends a static snapshot.
    • The snapshot is generated in advance (e.g., pages are re-cached every X days).
    • Great for small-to-medium SPAs where content doesn’t change often.
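The bot-detection layer boils down to a user-agent check in front of the SPA. A simplified sketch (the bot pattern, snapshot store, and `handleRequest` function are illustrative; tools like Prerender.io or Rendertron provide this layer for you):

```javascript
// Prerendering sketch: serve a cached HTML snapshot to known crawlers
// and the normal SPA shell to everyone else.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider/i;

// Stand-in for a snapshot cache populated in advance by a prerenderer.
const snapshotCache = new Map([
  ['/laptops/gaming',
   '<html><head><title>Laptops for Gamers</title></head><body>…</body></html>'],
]);

function handleRequest(userAgent, path, spaShell) {
  if (BOT_PATTERN.test(userAgent) && snapshotCache.has(path)) {
    return { source: 'snapshot', html: snapshotCache.get(path) };
  }
  return { source: 'spa', html: spaShell };
}

const shell = '<html><body><div id="app"></div></body></html>';
console.log(handleRequest('Googlebot/2.1', '/laptops/gaming', shell).source); // snapshot
console.log(handleRequest('Mozilla/5.0',   '/laptops/gaming', shell).source); // spa
```

Note that serving different HTML to bots is only safe when the snapshot matches what users see; the "what can go wrong" slide below covers the cloaking risk.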
  26. Prerendering

    Pros:
    • Full HTML delivered to crawlers
    • Easy to implement with tools (e.g., Rendertron, Prerender.io)
    • Solves indexing delays in SPAs (canonical, metadata etc. are within the initial HTML)
    • Lightweight on the server
    • Cache HITs < 100 ms
    Cons:
    • Not suited for highly dynamic pages
    • May miss real-time content needs
    • Separate rendering path for bots vs users
    • Can be tricky to scale
    • Cost might be an issue when re-rendering often
  27. Use SSR whenever possible

    It sends fully rendered HTML to search engines and users — perfect for SEO, speed, and personalization. If you're using frameworks like Next.js or Nuxt, enable SSR for all SEO-critical pages.
    Use prerendering only when:
    • SSR isn’t feasible yet
    • You're adding SEO to a legacy SPA
    • Content is static and user-agnostic
    Bottom line: SSR scales better for both SEO and user experience.
  28. What can go wrong?

    ▪ Delayed re-caching = outdated content: prerendered pages may become stale if not refreshed frequently — causing Google to index old or incorrect information.
    ▪ Bots and users see different content: if your prerendered snapshot differs from the live user experience, Google may flag the site for cloaking or inconsistency.
    ▪ Failed prerenders can harm us: if tags like <title>, <meta name="description">, or <link rel="canonical"> are injected late via JavaScript, they may not appear in the snapshot — or be missed by Googlebot.
    ▪ Routing or link-state bugs: SPAs using client-side routers may fail to prerender internal navigation paths correctly — breaking crawl paths.
    ▪ Missed JavaScript errors: if your prerendering tool crashes due to broken scripts or async content, you may silently ship blank or broken HTML pages.
    ▪ Re-caching limitations can hurt SEO: if DevOps can’t refresh pages fast enough, titles, descriptions, and product info can go outdated — costing visibility.
  29. Having your devs by your side is a great strategy

    Troubleshooting your SPA configuration is an ongoing job. Anything can break, so stay alert and fix issues fast & smoothly.