
How Your JavaScript Framework Affects SEO Rankings

Xyle Team

javascript seo · framework seo · ssr · csr · nextjs seo · react seo

Your JavaScript framework choice has a direct impact on whether search engines can index your content. Google can render JavaScript — but with delays, resource limits, and no guarantee that every page gets rendered. Other search engines and AI answer engines often cannot render JavaScript at all.

If your site ships an empty <div id="root"></div> and relies on client-side JavaScript to fill it in, you are making a bet that every crawler will execute your bundle correctly. That is a bet you will lose.

The Rendering Problem

Search engine crawlers work in two phases. First, they fetch the HTML response from your server. Second, they optionally render JavaScript to see the final DOM. The problem is that phase two is expensive, delayed, and inconsistent.

Google's renderer (WRS — Web Rendering Service) uses a recent version of Chromium, but it queues pages for rendering and processes them later. The gap between crawl and render can be hours or days. During that gap, Google sees only your initial HTML.

Bing's rendering is less capable and less consistent. AI engines like ChatGPT and Perplexity typically do not render JavaScript at all — they work with whatever HTML your server returns.

This means your initial HTML response is your SEO baseline. If it is empty, your baseline is zero content.
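A quick way to test this baseline: strip scripts and markup from the raw server response and see whether any visible text remains. A minimal TypeScript sketch — `looksLikeEmptyShell` is a hypothetical helper, and the regexes are a rough heuristic, not a real HTML parser:

```typescript
// Heuristic: does the initial HTML response carry any visible content,
// or is it an empty CSR shell waiting for JavaScript?
function looksLikeEmptyShell(html: string): boolean {
  // Grab the <body> contents.
  const bodyMatch = html.match(/<body[^>]*>([\s\S]*?)<\/body>/i);
  if (!bodyMatch) return true;
  // Drop <script> blocks, then strip all remaining tags.
  const withoutScripts = bodyMatch[1].replace(/<script[\s\S]*?<\/script>/gi, "");
  const visibleText = withoutScripts.replace(/<[^>]+>/g, "").trim();
  return visibleText.length === 0;
}
```

Feed it the raw response from your server (what `curl` returns), not the DevTools DOM — the DOM already reflects JavaScript execution.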

CSR vs SSR vs SSG vs Hybrid: What Each Means for SEO

There are four rendering strategies, and each has different SEO implications.

| Strategy | How It Works | Initial HTML | SEO Impact | Best For |
|----------|-------------|-------------|------------|----------|
| CSR (Client-Side Rendering) | Browser downloads JS bundle, renders in browser | Empty shell | Poor — crawlers see no content | Authenticated dashboards, internal tools |
| SSR (Server-Side Rendering) | Server renders HTML on each request | Full content | Good — crawlers see complete page | Dynamic content, e-commerce, news |
| SSG (Static Site Generation) | HTML pre-built at build time | Full content | Excellent — fastest TTFB, full content | Blogs, docs, landing pages, marketing |
| Hybrid (SSR + Client Hydration) | Server renders, browser hydrates for interactivity | Full content | Good — content visible, interactive after hydration | Most modern web apps |

CSR: The SEO Problem

A standard React app (Create React App or Vite) ships this initial HTML:

<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <div id="root"></div>
    <script src="/assets/bundle.js"></script>
  </body>
</html>

A search crawler that does not execute JavaScript sees a page with no content, no headings, no text, and no links. Even Google, which does render JS, may take days to get to your page in the rendering queue.

SSR: Content on First Response

With SSR, the server executes your components and returns complete HTML:

<!DOCTYPE html>
<html>
  <head>
    <title>Product Page — My Store</title>
    <meta name="description" content="High-quality widget with free shipping..." />
  </head>
  <body>
    <div id="root">
      <h1>Premium Widget</h1>
      <p>High-quality widget with free shipping on orders over $50...</p>
      <!-- Full rendered content -->
    </div>
    <script src="/assets/bundle.js"></script>
  </body>
</html>

Every crawler sees the full page on the first request. No rendering queue. No JavaScript dependency.

SSG: Pre-Built and Fast

Static generation takes this further by building all HTML at deploy time. There is no server computation per request — pages are served from a CDN. This gives you the fastest possible time-to-first-byte (TTFB) and ensures every crawler gets complete content instantly.

Framework-by-Framework SEO Guide

Next.js

Next.js is the strongest choice for SEO among React frameworks. It supports SSR, SSG, and hybrid rendering out of the box. Pages are server-rendered by default in the App Router.

Detection markers: __NEXT_DATA__ script tag, /_next/ asset paths. Xyle automatically detects Next.js and its rendering mode when you crawl a page.

SEO strengths: Built-in metadata API, automatic sitemap generation, image optimization with next/image, SSR/SSG by default.

React (CRA / Vite)

Plain React with Create React App or Vite is CSR by default. Your pages ship as an empty shell until JavaScript executes in the browser.

SEO impact: Poor without additional setup. If you need SEO, your main options are to migrate to Next.js or to add a pre-rendering service.

# Check if your React app is CSR-only
# View source should show actual content, not just <div id="root"></div>
curl -s https://yoursite.com | grep -c "<h1>"
# If this returns 0, crawlers see no headings
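The same check can be scripted. A TypeScript sketch — `countH1` is a hypothetical helper; pass it the raw server response (for example, from `fetch`), not the rendered DOM:

```typescript
// Count <h1> elements in a raw HTML string. Zero on a public page
// means no-JS crawlers see no top-level heading.
function countH1(html: string): number {
  // Match "<h1" followed by whitespace or ">", so <h1 class="..."> also counts.
  return (html.match(/<h1[\s>]/gi) ?? []).length;
}
```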

Vue / Nuxt

Plain Vue is CSR. Nuxt adds SSR and SSG support for Vue, similar to what Next.js does for React. If you are building a Vue app that needs SEO, use Nuxt.

Detection markers: __NUXT__ or __NUXT_DATA__ in the HTML, /_nuxt/ asset paths.

SEO recommendation: Use Nuxt with ssr: true (the default) for all public-facing pages.

Angular / Angular Universal

Plain Angular is CSR. Angular Universal adds server-side rendering. The setup is more involved than Next.js or Nuxt, but it works.

Detection markers: ng-version attribute on the root element, ngsw service worker references.

SEO recommendation: Use Angular Universal or the newer Angular SSR (@angular/ssr) for any page that needs indexing.

Svelte / SvelteKit

SvelteKit renders on the server by default and produces minimal client-side JavaScript. The compiled output is significantly smaller than React or Angular bundles, which benefits Core Web Vitals.

Detection markers: SvelteKit-specific data attributes, /_app/ asset paths.

SEO recommendation: SvelteKit is excellent for SEO out of the box. Keep the default SSR behavior.

WordPress

WordPress is server-rendered by default — PHP generates complete HTML. SEO issues with WordPress are rarely about rendering. They are about plugin bloat slowing page speed, poor heading hierarchy in themes, and missing structured data.

Detection markers: wp-content paths, wp-json API, meta generator tag.

Gatsby

Gatsby generates static HTML at build time (SSG). Excellent for blogs, documentation, and marketing sites. Limited for dynamic content since every change requires a rebuild.

Detection markers: gatsby- prefixed elements, ___gatsby root element.

How to Detect Your Rendering Type

You can check your rendering type manually by viewing the page source (not the DevTools DOM — that shows the rendered result). If the source HTML contains your content, you are using SSR or SSG. If it contains only a shell, you are using CSR.
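The detection markers listed in the framework guide above can be combined into a rough heuristic. A TypeScript sketch — `detectRendering` is hypothetical and far less thorough than a real detector:

```typescript
type Detection = { framework: string | null; renderingType: "CSR" | "SSR/SSG" };

// Infer framework and rendering type from raw HTML using known markers.
function detectRendering(html: string): Detection {
  let framework: string | null = null;
  if (html.includes("__NEXT_DATA__") || html.includes("/_next/")) framework = "Next.js";
  else if (html.includes("__NUXT__") || html.includes("/_nuxt/")) framework = "Nuxt";
  else if (html.includes("ng-version")) framework = "Angular";
  else if (html.includes("___gatsby")) framework = "Gatsby";
  else if (html.includes("wp-content")) framework = "WordPress";

  // If the body carries visible text before any JS runs, call it SSR/SSG.
  const body = html.match(/<body[^>]*>([\s\S]*?)<\/body>/i)?.[1] ?? "";
  const text = body
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, "")
    .trim();
  return { framework, renderingType: text.length > 0 ? "SSR/SSG" : "CSR" };
}
```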

The faster way is to use Xyle:

$ xyle crawl --url https://yoursite.com --json

The output includes a rendering section:

{
  "rendering": {
    "framework": "Next.js",
    "rendering_type": "SSR",
    "is_spa": false,
    "has_client_hydration": true,
    "seo_impact": "Positive — full content available on initial HTML response"
  }
}

This tells you exactly what framework is in use, how the page renders, and whether it is an SPA — without manually digging through source code.

Fixing CSR SEO Problems

If your site is CSR and you need SEO, here are your options in order of preference.

Option 1: Migrate to an SSR Framework

The most robust solution. If you are using React, migrate to Next.js. If Vue, migrate to Nuxt. This gives you SSR/SSG with minimal changes to your existing components.

# Next.js migration — your existing React components mostly work as-is
npx create-next-app@latest my-app --typescript
# Move your components to app/ or pages/ directory
# Add metadata exports for SEO

Option 2: Pre-Rendering Service

If migration is not feasible, a pre-rendering service runs a headless browser to generate static HTML snapshots of your pages. You serve these snapshots to crawlers while real users get the SPA experience.

// middleware.ts — detect crawler user agents and serve pre-rendered HTML
import { NextRequest, NextResponse } from "next/server";

const CRAWLER_AGENTS = [
  "googlebot", "bingbot", "slurp", "duckduckbot",
  "baiduspider", "yandexbot", "facebot", "ia_archiver",
];

export function middleware(request: NextRequest) {
  const userAgent = request.headers.get("user-agent")?.toLowerCase() || "";
  const isCrawler = CRAWLER_AGENTS.some((bot) => userAgent.includes(bot));

  if (isCrawler) {
    // Rewrite (not redirect) to the pre-rendered snapshot
    const prerenderUrl = `https://prerender.yourservice.com/${request.url}`;
    return NextResponse.rewrite(prerenderUrl);
  }

  return NextResponse.next();
}

This is a workaround, not a solution. It adds complexity, latency, and a maintenance burden. Prefer migrating to SSR.

Option 3: Add Noscript Fallback Content

The simplest stopgap — add critical content in <noscript> tags so crawlers that do not execute JavaScript still see something:

<noscript>
  <h1>Your Page Title</h1>
  <p>Key content that crawlers should index...</p>
</noscript>

This is the weakest option. It only helps crawlers that completely skip JavaScript, and the content can easily drift from your actual rendered page.
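If you do ship a <noscript> fallback, it is worth asserting in CI that it still contains your critical heading, so drift gets caught. A minimal sketch — `noscriptHasHeading` is a hypothetical helper:

```typescript
// Check that the <noscript> fallback still carries an <h1>,
// so crawlers that skip JavaScript see at least the page title.
function noscriptHasHeading(html: string): boolean {
  const noscript = html.match(/<noscript>([\s\S]*?)<\/noscript>/i)?.[1] ?? "";
  return /<h1[\s>]/i.test(noscript);
}
```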

Technical SEO Checks for JavaScript Sites

Beyond rendering, JavaScript sites have specific technical SEO pitfalls.

Canonical Tags in SPAs

Single-page applications change the URL via client-side routing without a page reload. Make sure your canonical tag updates with each route change:

// Next.js App Router — canonical is handled via metadata
export const metadata = {
  alternates: {
    canonical: "https://yoursite.com/current-page",
  },
};

For client-rendered SPAs, you need to dynamically update the canonical tag in the <head> when the route changes, or use a library like react-helmet:

import { Helmet } from "react-helmet";

function ProductPage({ slug }: { slug: string }) {
  return (
    <>
      <Helmet>
        <link rel="canonical" href={`https://yoursite.com/products/${slug}`} />
        <meta property="og:url" content={`https://yoursite.com/products/${slug}`} />
      </Helmet>
      <h1>Product Details</h1>
      {/* Page content */}
    </>
  );
}

Open Graph Tags with SSR

OG tags must be in the initial HTML response — social media crawlers do not execute JavaScript. If you are using SSR, this works naturally through your framework's metadata API. If you are using CSR, OG tags will not work without pre-rendering.
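A simple pre-deploy check: scan the raw HTML response for the OG properties you require. A sketch — `missingOgTags` is a hypothetical helper, and the default tag list is an assumption to adjust for your pages:

```typescript
// Return the OG properties absent from a raw HTML string.
// An empty array means every required tag is present before any JS runs.
function missingOgTags(
  html: string,
  required: string[] = ["og:title", "og:description", "og:url"],
): string[] {
  return required.filter((prop) => !new RegExp(`property=["']${prop}["']`).test(html));
}
```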

Verifying Your Technical SEO

After fixing rendering issues, verify that your technical SEO checks pass:

$ xyle crawl --url https://yoursite.com --json

The technical_seo section shows whether your canonical, robots, viewport, charset, lang, HTTPS, OG tags, and heading structure are correct:

{
  "technical_seo": {
    "has_canonical": true,
    "has_robots_meta": true,
    "has_viewport": true,
    "has_charset": true,
    "has_lang": true,
    "is_https": true,
    "has_og_tags": true,
    "h1_count": 1,
    "title_length": 54,
    "meta_description_length": 148
  }
}

If any check fails, you know exactly what to fix.

Getting Started

Here is a three-step audit for your JavaScript site:

  1. Check your rendering. Run xyle crawl --url <url> --json and look at the rendering section. If it shows CSR, you have work to do.
  2. Verify content in source HTML. View your page source (not DevTools) and confirm your headings, text, and meta tags are present without JavaScript execution.
  3. Fix what is broken. Migrate to SSR if possible, add pre-rendering if not, and verify technical SEO checks pass.

Your framework choice is a foundation decision. Getting rendering right means every other SEO optimization you make — from structured data to content quality — actually reaches the crawlers. Getting it wrong means none of it matters.

Run your first crawl with Xyle and see exactly how search engines experience your JavaScript site.

Ready to optimize your search rankings?

Xyle connects to Google Search Console, analyzes content gaps with AI, and gives you actionable fixes — from the terminal or dashboard.

Read the Docs