
How to Use the Google Search Console API: A Developer's Guide

Xyle Team

Tags: google search console · gsc api · seo automation

Google Search Console (GSC) is the single best source of truth for how your site performs in Google search. The web UI is fine for a quick glance, but if you want to build dashboards, automate reporting, or feed SEO data into your AI workflows, you need the API.

This guide walks you through setting up authentication, making your first query, and building something useful — all with working code.

Prerequisites

  • A Google Cloud project with the Search Console API enabled
  • A verified site in Google Search Console
  • Node.js 18+ (examples use TypeScript, but the API is language-agnostic)

Step 1: Authentication

The GSC API uses OAuth 2.0. You have two options:

Option A: Service Account (server-to-server)

Best for automated workflows and CI pipelines.

  1. Create a service account in Google Cloud Console
  2. Download the JSON key file
  3. Add the service account email as a user in GSC (Settings → Users → Add user)

import { google } from "googleapis";

const auth = new google.auth.GoogleAuth({
  keyFile: "./service-account.json",
  scopes: ["https://www.googleapis.com/auth/webmasters.readonly"],
});

const searchconsole = google.searchconsole({ version: "v1", auth });

Option B: OAuth 2.0 (user consent)

Best for apps where users connect their own GSC properties.

const oauth2Client = new google.auth.OAuth2(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  "http://localhost:3000/api/auth/callback"
);

// Generate consent URL
const authUrl = oauth2Client.generateAuthUrl({
  access_type: "offline",
  scope: ["https://www.googleapis.com/auth/webmasters.readonly"],
});

// After user consents, exchange code for tokens
const { tokens } = await oauth2Client.getToken(code);
oauth2Client.setCredentials(tokens);
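In a real app you need a route to receive that redirect. Here is a minimal sketch using Node's built-in http module — the port, path, and the `extractCode`/`waitForCode` helpers are illustrative, not part of the googleapis API:

```typescript
import http from "node:http";

// Pull the OAuth "code" query parameter out of the callback request URL.
function extractCode(requestUrl: string): string | null {
  return new URL(requestUrl, "http://localhost:3000").searchParams.get("code");
}

// One-shot local server: resolves with the code once Google redirects back.
// Call this after sending the user to authUrl, then pass the code to
// oauth2Client.getToken(code) as shown above.
function waitForCode(port = 3000): Promise<string> {
  return new Promise((resolve, reject) => {
    const server = http.createServer((req, res) => {
      const code = extractCode(req.url ?? "/");
      res.end(code ? "Connected! You can close this tab." : "Missing code.");
      server.close();
      if (code) resolve(code);
      else reject(new Error("no code in callback"));
    });
    server.listen(port);
  });
}
```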

Step 2: List Verified Sites

Before querying analytics, verify which sites the account has access to:

const sites = await searchconsole.sites.list();

for (const site of sites.data.siteEntry || []) {
  console.log(`${site.siteUrl} — ${site.permissionLevel}`);
}

Output:

https://example.com/ — siteOwner
https://blog.example.com/ — siteFullUser

Step 3: Query Search Analytics

This is the core endpoint. It returns queries, pages, clicks, impressions, CTR, and position data.

const response = await searchconsole.searchanalytics.query({
  siteUrl: "https://example.com/", // must match the verified property exactly (trailing slash included)
  requestBody: {
    startDate: "2026-01-01",
    endDate: "2026-03-20",
    dimensions: ["query", "page"],
    rowLimit: 25,
    dataState: "final",
  },
});

for (const row of response.data.rows || []) {
  const [query, page] = row.keys!;
  console.log(
    `"${query}" → ${page} | pos ${row.position?.toFixed(1)} | ${row.clicks} clicks`
  );
}

Dimensions You Can Use

| Dimension | Description |
|-----------|-------------|
| query | The search term the user typed |
| page | The URL that appeared in results |
| device | DESKTOP, MOBILE, or TABLET |
| country | ISO 3166-1 alpha-3 country code |
| date | Individual date (for time series) |
| searchAppearance | Rich result type (e.g., RECIPE, FAQ) |
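For example, adding the device dimension lets you compare mobile and desktop performance. Here is a small sketch — the `ctrByDevice` aggregation helper is ours, not part of the client library — that computes CTR per device from the rows a `dimensions: ["device"]` query returns:

```typescript
// Row shape returned by searchanalytics.query (fields we use here).
type Row = { keys?: string[] | null; clicks?: number | null; impressions?: number | null };

// Sum clicks and impressions per device, then compute CTR for each.
function ctrByDevice(rows: Row[]): Record<string, number> {
  const totals: Record<string, { clicks: number; impressions: number }> = {};
  for (const row of rows) {
    const device = row.keys?.[0] ?? "UNKNOWN";
    const t = (totals[device] ??= { clicks: 0, impressions: 0 });
    t.clicks += row.clicks ?? 0;
    t.impressions += row.impressions ?? 0;
  }
  return Object.fromEntries(
    Object.entries(totals).map(
      ([d, t]) => [d, t.impressions ? t.clicks / t.impressions : 0] as [string, number]
    )
  );
}
```

Pass it `response.data.rows` from a query whose dimensions include device (first) to see, for instance, whether mobile CTR lags desktop.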

Filtering Results

Narrow your data with dimension filters (filters within a group are combined with AND):

const response = await searchconsole.searchanalytics.query({
  siteUrl: "https://example.com/",
  requestBody: {
    startDate: "2026-01-01",
    endDate: "2026-03-20",
    dimensions: ["query"],
    dimensionFilterGroups: [
      {
        filters: [
          {
            dimension: "page",
            operator: "contains",
            expression: "/blog/",
          },
          {
            dimension: "query",
            operator: "notContains",
            expression: "brand-name",
          },
        ],
      },
    ],
    rowLimit: 100,
  },
});
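Because filters in a group are ANDed, there is no direct way to say "query equals A OR query equals B". The regex operators (includingRegex / excludingRegex, RE2 syntax) fill that gap. A small helper — our own, not a library function — that builds an anchored alternation matching any of a set of terms exactly:

```typescript
// Build an anchored RE2 alternation matching any of the given terms exactly,
// for use with the "includingRegex" filter operator.
function anyOfExactly(terms: string[]): string {
  const escaped = terms.map((t) => t.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"));
  return `^(${escaped.join("|")})$`;
}
```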

Step 4: Build Something Useful

Automated Position Tracking

Track how your rankings change over time by storing daily snapshots:

async function trackPositions(siteUrl: string, queries: string[]) {
  const today = new Date().toISOString().split("T")[0];
  const sevenDaysAgo = new Date(Date.now() - 7 * 86400000)
    .toISOString()
    .split("T")[0];

  const response = await searchconsole.searchanalytics.query({
    siteUrl,
    requestBody: {
      startDate: sevenDaysAgo,
      endDate: today,
      dimensions: ["query", "date"],
      dimensionFilterGroups: [
        {
          // Filters inside one group are ANDed, so mapping each query to
          // its own "equals" filter would match nothing. Match the whole
          // set with a single anchored regex instead.
          filters: [
            {
              dimension: "query" as const,
              operator: "includingRegex" as const,
              expression: `^(${queries
                .map((q) => q.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"))
                .join("|")})$`,
            },
          ],
        },
      ],
    },
  });

  // Group by query and detect position changes
  const byQuery = new Map<string, { date: string; position: number }[]>();

  for (const row of response.data.rows || []) {
    const [query, date] = row.keys!;
    if (!byQuery.has(query)) byQuery.set(query, []);
    byQuery.get(query)!.push({ date, position: row.position! });
  }

  for (const [query, data] of byQuery) {
    const sorted = data.sort((a, b) => a.date.localeCompare(b.date));
    const first = sorted[0].position;
    const last = sorted[sorted.length - 1].position;
    const delta = first - last; // positive = improved
    console.log(
      `"${query}": ${last.toFixed(1)} (${delta > 0 ? "↑" : "↓"} ${Math.abs(delta).toFixed(1)})`
    );
  }
}
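To actually keep the daily snapshots mentioned above, you need to persist each run somewhere. A minimal sketch that accumulates runs in a local JSON file — positions.json is a hypothetical path; swap in your database of choice:

```typescript
import fs from "node:fs";

type Snapshot = Record<string, number>; // query -> average position

// Merge one day's snapshot into a { date -> snapshot } history object.
function addSnapshot(
  history: Record<string, Snapshot>,
  date: string,
  snap: Snapshot
): Record<string, Snapshot> {
  return { ...history, [date]: snap };
}

// Read-modify-write the JSON file so repeated trackPositions runs
// accumulate a time series you can chart later.
function saveSnapshot(date: string, snap: Snapshot, file = "./positions.json") {
  const history: Record<string, Snapshot> = fs.existsSync(file)
    ? JSON.parse(fs.readFileSync(file, "utf8"))
    : {};
  fs.writeFileSync(file, JSON.stringify(addSnapshot(history, date, snap), null, 2));
}
```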

Content Gap Detection

Find queries where you rank on page 2 (positions 11-20) — these are your biggest opportunities:

async function findContentGaps(siteUrl: string) {
  const response = await searchconsole.searchanalytics.query({
    siteUrl,
    requestBody: {
      startDate: "2026-01-01",
      endDate: "2026-03-20",
      dimensions: ["query", "page"],
      rowLimit: 1000,
    },
  });

  const gaps = (response.data.rows || [])
    .filter((row) => row.position! >= 11 && row.position! <= 20)
    .filter((row) => row.impressions! > 100)
    .sort((a, b) => b.impressions! - a.impressions!);

  console.log("Content gaps (high impressions, page 2):");
  for (const row of gaps.slice(0, 20)) {
    const [query, page] = row.keys!;
    console.log(
      `  "${query}" → pos ${row.position?.toFixed(1)} | ${row.impressions} imp | ${page}`
    );
  }
}

Rate Limits & Best Practices

The GSC API has the following limits:

  • Queries per minute: 1,200 per site (and per user)
  • Row limit per request: 25,000 max
  • Date range: Up to 16 months of data
  • Data freshness: Usually 2-3 days behind

Tips:

  • Cache responses aggressively — GSC data does not change retroactively for finalized dates.
  • Use dataState: "final" for accurate data (excludes the most recent 2-3 days).
  • Paginate with startRow for large result sets.
  • Store historical data in your own database — GSC only retains 16 months.
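The startRow tip can be wrapped in a small pager. This generic sketch — `fetchAllRows` is our helper, not a library function — keeps requesting pages until a short page signals the end:

```typescript
// Calls fetchPage(startRow) until it returns fewer than pageSize rows.
async function fetchAllRows<T>(
  fetchPage: (startRow: number) => Promise<T[]>,
  pageSize = 25000 // the API's per-request maximum
): Promise<T[]> {
  const rows: T[] = [];
  for (let startRow = 0; ; startRow += pageSize) {
    const page = await fetchPage(startRow);
    rows.push(...page);
    if (page.length < pageSize) break; // short page = no more data
  }
  return rows;
}

// Usage with the client from Step 1:
// const rows = await fetchAllRows((startRow) =>
//   searchconsole.searchanalytics
//     .query({ siteUrl, requestBody: { startDate, endDate, dimensions: ["query"], rowLimit: 25000, startRow } })
//     .then((res) => res.data.rows ?? [])
// );
```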

Skip the Boilerplate with Xyle

If you would rather skip the auth setup, pagination logic, and data storage, Xyle handles all of it. Connect your GSC properties in one click, then access your data via the dashboard, CLI, or REST API.

$ npm install -g @xyleapp/cli
$ xyle login
$ xyle queries --site example.com --limit 20
$ xyle gaps --site example.com

Xyle syncs your GSC data automatically, runs AI analysis on content gaps, detects AEO signals (schema markup, content structure, quality metrics), and gives you dual SEO + AEO scores — no OAuth setup required. Use the dashboard analysis page for visual results or the CLI for automation.

Whether you build your own integration or use a tool like Xyle, the key takeaway is this: your GSC data is too valuable to only view in a web UI. Programmatic access unlocks automated monitoring, AI-powered optimization for both search engines and AI answer engines, and workflows that scale with your site.

Ready to optimize your search rankings?

Xyle connects to Google Search Console, analyzes content gaps with AI, and gives you actionable fixes — from the terminal or dashboard.

Read the Docs