How to Turn Claude Code Into Your SEO Command Center
Set up Claude Code to pull GSC, GA4, and Google Ads data, then ask cross-source questions that would take hours with spreadsheets. Step-by-step setup guide.
Key Takeaways
- Claude Code + Google APIs lets you cross-reference GSC, GA4, and Google Ads data in seconds using natural language questions
- Setup time: About an hour for initial config, then 5 minutes per monthly data refresh
- Biggest win: Paid-organic overlap analysis can surface spend reduction opportunities and content gaps across your accounts in a single query
- No dashboards required. Data lands in JSON files, and you talk to Claude Code directly
- Technical requirement: A Google Cloud service account (free) and basic comfort with the terminal
- Start with GSC data first for the fastest value, then layer in GA4 and Ads
Most SEO work means switching between Google Search Console, GA4, Google Ads, and whatever AI tool you're testing this week. Each platform has its own export format, its own dashboard, its own way of making simple questions hard to answer.
Will Scott, who runs the agency Search Influence, described the core problem in a Search Engine Land tutorial (March 2026): cross-referencing data across these platforms takes an afternoon with spreadsheets. With Claude Code, you can ask the same questions conversationally and get answers from the combined data.
This guide walks you through the exact setup: connect your Google APIs, fetch data into JSON files, and ask Claude Code questions that span all three sources.
What You're Building
A project directory where Claude Code has access to Python scripts that pull live data from your Google APIs. You fetch the data, it lands in JSON files, and then you talk to it.
No dashboards to build. No Looker Studio templates to maintain.
```
seo-project/
├── config.json          # Client details + API property IDs
├── fetchers/
│   ├── fetch_gsc.py     # Google Search Console
│   ├── fetch_ga4.py     # Google Analytics 4
│   └── fetch_ads.py     # Google Ads search terms
├── data/
│   ├── gsc/             # Query + page performance
│   ├── ga4/             # Traffic by channel, top pages
│   └── ads/             # Search terms, spend, conversions
└── reports/             # Generated analysis
```
Step 1: Set Up Google API Authentication
Everything runs through a Google Cloud service account. One service account covers both GSC and GA4.
Service Account (for GSC + GA4)
- Create a project in Google Cloud Console
- Enable the Search Console API and Google Analytics Data API
- Create a service account under IAM & Admin > Service Accounts
- Download the JSON key file
- Add the service account email as a user in your GSC property (read access)
- Add it as a Viewer in your GA4 property
The service account email looks like your-project@your-project-id.iam.gserviceaccount.com. Add this email to each client's GSC and GA4 properties the same way you'd add a team member.
For agencies: One service account works across all clients. Add it to each property, update a config file with the property IDs, and you're set.
Google Ads Authentication
Google Ads needs a separate OAuth setup:
- A developer token from Google Ads API Center (Tools & Settings > Setup > API Center)
- OAuth 2.0 credentials from Google Cloud (separate from the service account)
- A one-time browser authentication to generate a refresh token
If you don't have API access yet, download 90 days of keyword and search terms data as CSVs from the Google Ads UI and drop them in your data directory. Claude Code works with those fine.
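Once you have all three pieces, the google-ads client library reads them from a `google-ads.yaml` file. A template with placeholder values (fill in your own credentials; `login_customer_id` is only needed when you access accounts through an MCC):

```yaml
# google-ads.yaml — read by GoogleAdsClient.load_from_storage()
developer_token: "YOUR_DEVELOPER_TOKEN"
client_id: "YOUR_OAUTH_CLIENT_ID.apps.googleusercontent.com"
client_secret: "YOUR_OAUTH_CLIENT_SECRET"
refresh_token: "YOUR_REFRESH_TOKEN"
login_customer_id: "1234567890"   # MCC / manager account ID, digits only
use_proto_plus: True
```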
Install Dependencies
```shell
pip install google-api-python-client google-auth google-analytics-data google-ads
```
Step 2: Build the Data Fetchers
Each fetcher is a Python script that authenticates, pulls data, and saves JSON. You don't need to write these from scratch. Describe what you want to Claude Code and it writes them.
Google Search Console Fetcher
```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']

def get_gsc_service():
    credentials = service_account.Credentials.from_service_account_file(
        'service-account-key.json', scopes=SCOPES
    )
    return build('searchconsole', 'v1', credentials=credentials)

def fetch_queries(service, site_url, start_date, end_date):
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            'startDate': start_date,
            'endDate': end_date,
            'dimensions': ['query'],
            'rowLimit': 1000
        }
    ).execute()
    return response.get('rows', [])
```
You get back queries with clicks, impressions, CTR, and average position. Save it as JSON.
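Saving is a few lines. A minimal helper (the path layout follows the project tree above):

```python
import json
from pathlib import Path

def save_rows(rows, out_path):
    """Write fetched API rows to a JSON file, creating parent dirs as needed."""
    path = Path(out_path)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(rows, indent=2))
    return path

# Example:
# rows = fetch_queries(service, "https://www.example.com/", "2026-01-01", "2026-01-31")
# save_rows(rows, "data/gsc/queries.json")
```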
GA4 Fetcher

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    RunReportRequest, DateRange, Metric, Dimension
)

# Reuses the same service-account key as the GSC fetcher
client = BetaAnalyticsDataClient.from_service_account_file('service-account-key.json')

def fetch_traffic_by_channel(client, property_id, start_date, end_date):
    request = RunReportRequest(
        property=f"properties/{property_id}",
        date_ranges=[DateRange(start_date=start_date, end_date=end_date)],
        dimensions=[Dimension(name="sessionDefaultChannelGroup")],
        metrics=[
            Metric(name="sessions"),
            Metric(name="totalUsers"),
            Metric(name="bounceRate"),
        ]
    )
    return client.run_report(request)
```
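The GA4 response is a protobuf object, not JSON. A small converter turns it into plain dicts you can save; this is a sketch that walks the standard `RunReportResponse` shape (dimension and metric headers plus rows):

```python
def report_to_dicts(response):
    """Flatten a GA4 RunReportResponse into JSON-serializable dicts."""
    dims = [h.name for h in response.dimension_headers]
    mets = [h.name for h in response.metric_headers]
    records = []
    for row in response.rows:
        # Each row pairs values with the headers positionally
        rec = {name: dv.value for name, dv in zip(dims, row.dimension_values)}
        rec.update({name: mv.value for name, mv in zip(mets, row.metric_values)})
        records.append(rec)
    return records
```

Note that GA4 returns metric values as strings; cast to `int`/`float` before doing math on them.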
Google Ads Fetcher
Google Ads uses Google Ads Query Language (GAQL):
```python
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
ga_service = client.get_service("GoogleAdsService")

query = """
    SELECT
      search_term_view.search_term,
      metrics.impressions,
      metrics.clicks,
      metrics.cost_micros,
      metrics.conversions
    FROM search_term_view
    WHERE segments.date DURING LAST_30_DAYS
    ORDER BY metrics.impressions DESC
"""

response = ga_service.search(customer_id="1234567890", query=query)
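One gotcha: Google Ads reports cost in micros (1,000,000 micros = 1 unit of account currency). A small helper pair to convert and flatten the response into dicts for your JSON files; the field names mirror the GAQL query above, and `ads_rows_to_dicts` is a hypothetical helper, not part of the library:

```python
def micros_to_currency(micros):
    """Convert Google Ads cost_micros to currency units (1,000,000 micros = 1 unit)."""
    return micros / 1_000_000

def ads_rows_to_dicts(response):
    """Flatten GoogleAdsService search results into plain dicts."""
    return [{
        "search_term": row.search_term_view.search_term,
        "impressions": row.metrics.impressions,
        "clicks": row.metrics.clicks,
        "cost": micros_to_currency(row.metrics.cost_micros),
        "conversions": row.metrics.conversions,
    } for row in response]
```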
Step 3: Create a Client Config
One JSON file per client with property IDs and context:
```json
{
  "name": "Client Name",
  "domain": "example.com",
  "gsc_property": "https://www.example.com/",
  "ga4_property_id": "319491912",
  "google_ads_customer_id": "9270739126",
  "industry": "Higher Education",
  "competitors": [
    "https://competitor1.com/",
    "https://competitor2.com/"
  ]
}
```
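A loader that fails fast when a property ID is missing saves you a confusing API error later. A minimal sketch (the required-key set is an assumption; extend it to match your config):

```python
import json
from pathlib import Path

REQUIRED_KEYS = {"gsc_property", "ga4_property_id", "google_ads_customer_id"}

def load_client_config(path):
    """Load a client config.json and fail fast if a property ID is missing."""
    config = json.loads(Path(path).read_text())
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"{path} is missing keys: {sorted(missing)}")
    return config
```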
Step 4: Ask Cross-Source Questions
With GSC + GA4 + Ads data in JSON files, Claude Code can cross-reference everything.
The Paid-Organic Gap Analysis
The single most valuable question:
Compare the GSC query data against the Google Ads search terms. Find keywords where we're paying for clicks but already have strong organic positions. Also, find keywords where we're spending on ads with zero organic visibility. Those are content gaps.
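Under the hood, this comparison is set arithmetic over two JSON files. A minimal sketch of the logic Claude Code writes for you (field names assume the fetchers above; the strong-position threshold of 5.0 is an illustrative assumption):

```python
def paid_organic_gaps(gsc_rows, ads_rows, strong_position=5.0):
    """Classify keywords by paid/organic overlap.

    gsc_rows: [{"query": ..., "position": ...}, ...]  (from GSC)
    ads_rows: [{"search_term": ...}, ...]             (from Google Ads)
    """
    organic = {r["query"]: r for r in gsc_rows}
    paid = {r["search_term"] for r in ads_rows}

    # Paying for clicks we may already win organically
    reduce_spend = [t for t in paid
                    if t in organic and organic[t]["position"] <= strong_position]
    # Paying for terms with no organic presence at all: content gaps
    content_gaps = [t for t in paid if t not in organic]
    # Strong organic queries with no paid coverage: amplification candidates
    amplify = [q for q in organic
               if q not in paid and organic[q]["position"] <= strong_position]
    return {"reduce_spend": reduce_spend,
            "content_gaps": content_gaps,
            "amplify": amplify}
```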
When Will Scott ran this for a higher education client (Search Engine Land, March 2026), the analysis surfaced:
| Finding | Count |
|---|---|
| Search terms with wasted ad spend (impressions, zero clicks) | 2,742 |
| Opportunities to reduce paid spend (strong organic position) | 351 |
| High-performing organic queries paid could amplify | 33 |
| Content gaps (paid-only, no organic presence) | 41 |
That analysis took about 90 seconds. The manual equivalent takes most of an afternoon.
Other Questions Worth Asking
Once you have multiple data sources loaded:
- "Which pages get the most impressions in GSC but have low CTR? What's the traffic from GA4 for those same pages?" (meta description/title opportunities)
- "What are the top 20 organic queries by impression that we're not running ads against?" (paid amplification candidates)
- "Group the GSC queries by topic cluster and show which clusters have the most impressions but lowest average position." (content investment priorities)
- "Which pages in GA4 have high bounce rates but strong GSC positions?" (content improvement targets)
Claude Code handles the cross-referencing in seconds. You can follow up with another question without rebuilding the analysis from scratch.
Step 5: Add AI Visibility Tracking
Traditional SERP positions aren't the full picture anymore. Between Google's AI Overviews, AI Mode, Copilot, ChatGPT, and Perplexity, you need to know whether AI systems are citing your content.
Tracking Options
| Tool | Cost | What It Covers |
|---|---|---|
| Bing Webmaster Tools | Free | Copilot/Bing AI citations (first-party data) |
| DataForSEO AI Overview API | ~$0.01/query ($50 min) | Google AI Overview citations + LLM mentions |
| SerpApi | From $75/mo | Full SERP including AI Overviews |
| SearchAPI.io | From $40/mo | SERP + Google AI Mode API |
| Direct LLM API calls | Under $20/mo | Brand mention monitoring via OpenAI, Anthropic, Perplexity APIs |
Start with Bing Webmaster Tools (free, first-party data) and one SERP API for Google AI Overview data. Layer in direct LLM API calls if budget allows.
Export the data as CSV or JSON and drop it in your data directory. Claude Code cross-references AI citations against your GSC and Ads data.
The Workflow in Practice
Setup (once per client, ~15 min):
- Add service account email to client's GSC and GA4
- Get their Google Ads customer ID
- Create a config.json with property IDs
Monthly data pull (~5 min):
```shell
python3 run_fetch.py --sources gsc,ga4,ads
```
Analysis (as needed): Open Claude Code in the project directory and ask questions. The data is there.
Output: Claude Code generates a markdown report. Push to Google Docs for client-facing deliverables.
Total for a new client: about 35 minutes (setup + fetch + analysis). Monthly refreshes: about 20 minutes including analysis time.
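The `run_fetch.py` entry point named in the monthly pull can be a thin argparse dispatcher over the fetchers from Step 2. A sketch under that assumption (the fetcher registry here is a placeholder; wire in your real functions):

```python
import argparse

# In the real script, import the fetch functions from fetchers/ here.
FETCHERS = {"gsc": None, "ga4": None, "ads": None}

def parse_sources(value):
    """Turn a comma-separated --sources value into a validated list."""
    sources = [s.strip() for s in value.split(",") if s.strip()]
    unknown = sorted(set(sources) - FETCHERS.keys())
    if unknown:
        raise argparse.ArgumentTypeError(f"unknown sources: {unknown}")
    return sources

def main(argv=None):
    parser = argparse.ArgumentParser(description="Pull fresh data for one client")
    parser.add_argument("--sources", type=parse_sources,
                        default=list(FETCHERS), help="comma-separated: gsc,ga4,ads")
    args = parser.parse_args(argv)
    for source in args.sources:
        print(f"fetching {source}...")  # call the matching fetcher here

if __name__ == "__main__":
    main()
```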
What This Doesn't Replace
Claude Code reads your data and finds patterns across sources faster than you can manually. It's not telling you what to do about those patterns. You still need someone who understands the client's business, their competitive situation, and what they're trying to accomplish.
Verify what it gives you. LLMs can hallucinate, including during data analysis. Spot-check the numbers against the source data. If something looks too clean or too dramatic, look at the raw file.
This also doesn't replace your existing platforms. For historical trend data, automated alerts, or client-facing dashboards, you still want Semrush or Ahrefs. What this gives you is the ability to ask ad hoc questions across multiple data sources, something none of those platforms does well alone.
FAQ
How long does the initial setup take?
About an hour for the first client: creating the Google Cloud project, setting up the service account, enabling APIs, and running the first data fetch. Subsequent clients take 15 minutes since the service account and scripts are already in place.
Do I need coding experience to use this?
Basic terminal comfort helps, but you don't need to write the Python scripts yourself. Describe what you want to Claude Code ("pull the top 1,000 queries from Search Console for the last 90 days") and it handles the API authentication, endpoints, and query parameters. It already knows these APIs.
Can this work for multiple clients?
Yes. One service account covers all clients. Create a config.json per client with their property IDs, and switch between them. Agencies running 5-50 accounts will see the biggest time savings from this setup.
What's the cost to run this?
The Google APIs are free. Claude Code requires a Claude Pro or Max subscription ($20/month+). The AI visibility tracking APIs range from free (Bing Webmaster Tools) to $40-75/month for SERP APIs. Total cost: $20-95/month depending on your stack.
Is the data analysis accurate?
Mostly, but treat it like work from a new analyst: trust but verify. Claude Code occasionally reports numbers that don't match the source JSON. It's rare, but it happens. Always spot-check before sharing anything with clients.
How does this compare to Looker Studio or Data Studio dashboards?
Different use cases. Dashboards are for ongoing monitoring and client-facing reporting. This setup is for ad hoc analysis and cross-source questions you'd never build a dashboard for. Use both.