Tutorial

How to Build a Competitor Analysis Tool with Python and a SERP API

By Serpent API Team · 10 min read

Competitive intelligence is one of the highest-value applications for search data. Knowing where your competitors rank — and for which keywords — gives you actionable insight that would otherwise require expensive enterprise SEO tools or hours of manual research. With a SERP API and a few hundred lines of Python, you can build a tool that automates this entirely.

In this tutorial, we will build a complete competitor analysis CLI tool using Serpent API (at just $0.00005 per search, it is 200 times cheaper than SerpApi). The tool will accept a domain name and a list of target keywords, then produce a structured competitive intelligence report showing where each competitor ranks and where you have the most opportunity to outrank them.

What We'll Build

By the end of this tutorial, you will have a Python CLI tool that:

  • Accepts a target domain, a list of competitors, and a keyword list as input
  • Fetches live SERP results for each keyword via Serpent API
  • Identifies where each domain (yours and competitors) ranks for each keyword
  • Produces a summary matrix showing the full competitive landscape at a glance
  • Exports the results to a CSV file for sharing with stakeholders
  • Can be scheduled as a weekly cron job to track ranking changes over time

The output will look like this:

Competitor Analysis Report — 2026-02-16
Keywords analyzed: 5 | Domains tracked: 3

Keyword                        yoursite.com  competitor-a.com  competitor-b.com
----------------------------  ------------  ----------------  ----------------
python web scraping tutorial           3               1                 7
best SERP API 2026                     1               4                N/A
automated SEO tools                    N/A             2                 3
rank tracking software                 8               1                N/A
competitor analysis python             2               N/A               5

Average Position:                      3.5             2.0               5.0
Keywords in Top 3:                     3               3                 1
Keywords in Top 10:                    4               4                 3

Prerequisites and Setup

You will need Python 3.8 or later. Install the required packages with:

pip install requests pandas tabulate

  • requests — HTTP client for calling the Serpent API
  • pandas — DataFrame manipulation for building the analysis matrix
  • tabulate — Pretty-printing tables to the terminal

Get your free Serpent API key at apiserpent.com. The free tier includes 100 searches, enough to analyze 100 keywords: the tool makes one search per keyword and checks every domain against that single result set, so the number of competitors does not affect the cost. Paid usage is billed at $0.00005 per search with no monthly minimum, so a weekly job analyzing 50 keywords costs just $0.0025 per run.

Step 1 — Fetch SERP Results for Target Keywords

Start by creating a file called competitor_analysis.py. The first function handles fetching SERP data for a single keyword:

import requests
import json
import time
from typing import Optional, List, Dict

API_KEY = "YOUR_API_KEY"

def get_serp_results(keyword: str, country: str = "us", num: int = 20) -> dict:
    """
    Fetch SERP results for a keyword via Serpent API.

    Args:
        keyword: The search query to look up.
        country: ISO 3166-1 alpha-2 country code.
        num: Number of results to return (max 100).

    Returns:
        Full JSON response from Serpent API.
    """
    response = requests.get(
        "https://apiserpent.com/api/search",
        params={
            "q": keyword,
            "num": num,
            "country": country,
            "apiKey": API_KEY
        },
        timeout=15
    )
    response.raise_for_status()
    return response.json()


# Test it
data = get_serp_results("best python web frameworks 2026")
organic = data.get("results", {}).get("organic", [])
print(f"Fetched {len(organic)} results")
if organic:
    print(f"Top result: {organic[0]['title']} — {organic[0]['url']}")

We request 20 results per keyword by default, which gives us visibility into the top two pages of search results. For most competitive intelligence purposes, tracking positions 1–20 is sufficient. If you need deeper data, increase num up to 100.

Step 2 — Extract Competitor Rankings

Next, add a function that checks whether a specific domain appears in the SERP results and returns its position data:

def find_domain_position(serp_data: dict, domain: str) -> Optional[Dict]:
    """
    Find the position of a domain in search results.

    Args:
        serp_data: Full Serpent API response JSON.
        domain: Domain to search for (e.g., 'example.com').

    Returns:
        Dict with position details, or None if domain not found.
    """
    organic = serp_data.get("results", {}).get("organic", [])
    domain_clean = domain.lower().replace("www.", "")

    for result in organic:
        result_url = result["url"].lower().replace("www.", "")
        if domain_clean in result_url:
            return {
                "position": result["position"],
                "title": result["title"],
                "url": result["url"],
                "snippet": result.get("snippet", "")
            }
    return None  # Domain not found in top results


# Test the function
serp = get_serp_results("python tutorials for beginners")
position_data = find_domain_position(serp, "realpython.com")
if position_data:
    print(f"Found at position {position_data['position']}: {position_data['title']}")
else:
    print("Domain not found in top 20 results")

The domain_clean normalization strips the www. prefix before comparison, so www.example.com and example.com will both match correctly. The function returns None when the domain is absent from the results — we will display this as "N/A" in the final report.
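One caveat: the substring check can over-match, since "example.com" is also contained in "notexample.com". If you see false positives, a stricter hostname comparison avoids this. The sketch below uses only the standard library; the helper names are my own, not part of the tutorial's code:

```python
from urllib.parse import urlparse

def _normalize_host(host: str) -> str:
    """Lowercase a hostname and strip a leading 'www.' prefix."""
    host = host.lower()
    return host[4:] if host.startswith("www.") else host

def domain_matches(url: str, domain: str) -> bool:
    """True if the URL's hostname is the domain itself or a subdomain of it."""
    host = _normalize_host(urlparse(url).hostname or "")
    target = _normalize_host(domain)
    return host == target or host.endswith("." + target)

# Exact and subdomain matches succeed; lookalike domains do not
print(domain_matches("https://www.example.com/page", "example.com"))   # True
print(domain_matches("https://blog.example.com/post", "example.com"))  # True
print(domain_matches("https://notexample.com/page", "example.com"))    # False
```

Swap this into find_domain_position in place of the substring test if precision matters more than catching every URL variant.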


Step 3 — Analyze Multiple Competitors Across Keywords

Now we can combine the two functions into a loop that builds a complete competitive intelligence matrix. This is the core of the tool:

import pandas as pd
from datetime import datetime

def run_competitor_analysis(
    keywords: List[str],
    domains: List[str],
    country: str = "us",
    num_results: int = 20,
    delay_seconds: float = 1.0
) -> pd.DataFrame:
    """
    Analyze ranking positions for multiple domains across multiple keywords.

    Args:
        keywords: List of search queries to analyze.
        domains: List of domains to track (include your own first).
        country: Country to localize results for.
        num_results: Number of SERP results to fetch per keyword.
        delay_seconds: Delay between API calls to avoid rate limiting.

    Returns:
        DataFrame with keywords as rows and domains as columns.
    """
    results = []

    for i, keyword in enumerate(keywords):
        print(f"[{i+1}/{len(keywords)}] Analyzing: '{keyword}'")

        try:
            serp_data = get_serp_results(keyword, country=country, num=num_results)
        except requests.exceptions.RequestException as e:
            print(f"  Error fetching '{keyword}': {e}")
            # Fill row with N/A on error
            row = {"keyword": keyword}
            for domain in domains:
                row[domain] = None
            results.append(row)
            continue

        row = {"keyword": keyword}
        for domain in domains:
            position_data = find_domain_position(serp_data, domain)
            row[domain] = position_data["position"] if position_data else None
            status = position_data["position"] if position_data else "N/A"
            print(f"  {domain}: position {status}")

        results.append(row)

        # Rate limiting between requests
        if i < len(keywords) - 1:
            time.sleep(delay_seconds)

    df = pd.DataFrame(results)
    df = df.set_index("keyword")
    return df


# --- Main Analysis ---
my_domain = "yoursite.com"
competitors = ["competitor-a.com", "competitor-b.com", "competitor-c.com"]
all_domains = [my_domain] + competitors

target_keywords = [
    "python web scraping tutorial",
    "best SERP API 2026",
    "automated SEO tools",
    "rank tracking software",
    "competitor analysis python"
]

print(f"Starting analysis: {len(target_keywords)} keywords x {len(all_domains)} domains")
print(f"Estimated API calls: {len(target_keywords)} (${len(target_keywords) * 0.00005:.5f})\n")

df = run_competitor_analysis(
    keywords=target_keywords,
    domains=all_domains,
    country="us",
    num_results=20
)

print("\nRaw results matrix:")
print(df.to_string())

At $0.00005 per search, analyzing 5 keywords costs $0.00025. Even a weekly job tracking 100 keywords costs just $0.005 per run, or roughly $0.26 per year. This is the economic advantage of building on Serpent API over paying $75/month minimums at SerpApi.
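The intro promised a CLI that accepts the target domain, competitors, and keywords as input rather than hardcoding them. Here is one way to wire that up with argparse; the flag names are my own choice, not an established interface:

```python
import argparse

def parse_args(argv=None):
    """Parse CLI arguments; pass a list for testing, or None to read sys.argv."""
    parser = argparse.ArgumentParser(description="SERP competitor analysis tool")
    parser.add_argument("--domain", required=True, help="Your domain, e.g. yoursite.com")
    parser.add_argument("--competitors", nargs="+", required=True,
                        help="One or more competitor domains")
    parser.add_argument("--keywords-file", required=True,
                        help="Text file with one keyword per line")
    parser.add_argument("--country", default="us", help="ISO 3166-1 alpha-2 country code")
    return parser.parse_args(argv)

# Example invocation:
#   python competitor_analysis.py --domain yoursite.com \
#       --competitors competitor-a.com competitor-b.com --keywords-file keywords.txt
```

In main, read the keywords file with a plain open()/splitlines() and pass the parsed values into run_competitor_analysis in place of the hardcoded lists.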

Step 4 — Generate a Report

Raw position data becomes actionable when you add summary statistics. This function transforms the DataFrame into a complete competitive intelligence report:

from tabulate import tabulate

def generate_report(df: pd.DataFrame, output_file: Optional[str] = None):
    """
    Generate and print a formatted competitive analysis report.

    Args:
        df: DataFrame from run_competitor_analysis().
        output_file: Optional CSV file path to save results.
    """
    print("\n" + "="*70)
    print(f"COMPETITOR ANALYSIS REPORT — {datetime.now().strftime('%Y-%m-%d')}")
    print("="*70)
    print(f"Keywords analyzed: {len(df)} | Domains tracked: {len(df.columns)}\n")

    # Replace None with "N/A" for display
    display_df = df.copy().fillna("N/A")

    print(tabulate(display_df, headers="keys", tablefmt="simple"))

    # Summary statistics
    print("\n" + "-"*70)
    print("SUMMARY STATISTICS")
    print("-"*70)

    summary_rows = []
    for domain in df.columns:
        col = df[domain].dropna()
        avg_pos = col.mean() if len(col) > 0 else None
        top3 = (col <= 3).sum()
        top10 = (col <= 10).sum()
        keywords_ranking = len(col)
        summary_rows.append({
            "Domain": domain,
            "Keywords Ranking": f"{keywords_ranking}/{len(df)}",
            "Avg Position": f"{avg_pos:.1f}" if avg_pos else "N/A",
            "Top 3": top3,
            "Top 10": top10
        })

    summary_df = pd.DataFrame(summary_rows).set_index("Domain")
    print(tabulate(summary_df, headers="keys", tablefmt="simple"))

    # Opportunity analysis
    print("\n" + "-"*70)
    print("OPPORTUNITY KEYWORDS (where you rank lower than a competitor)")
    print("-"*70)

    your_domain = df.columns[0]
    for keyword in df.index:
        your_pos = df.loc[keyword, your_domain]
        for competitor in df.columns[1:]:
            comp_pos = df.loc[keyword, competitor]
            # pandas stores missing positions as NaN, not None, so test with
            # pd.notna() — an "is not None" check would treat NaN as a real position
            if pd.isna(your_pos) and pd.notna(comp_pos):
                print(f"  '{keyword}': You unranked vs {competitor} #{int(comp_pos)}")
            elif pd.notna(your_pos) and pd.notna(comp_pos) and comp_pos < your_pos:
                gap = int(your_pos - comp_pos)
                print(f"  '{keyword}': You #{int(your_pos)} vs {competitor} #{int(comp_pos)} (gap: {gap})")

    # Export to CSV
    if output_file:
        df.to_csv(output_file)
        print(f"\nResults saved to: {output_file}")


generate_report(df, output_file=f"competitor_report_{datetime.now().strftime('%Y%m%d')}.csv")

The opportunity analysis section is the most actionable part of the report. It automatically surfaces keywords where a competitor outranks you, along with the position gap — telling you exactly where to focus your content improvement efforts first.


Scheduling Weekly Reports

A one-time snapshot is useful, but the real value comes from tracking position changes over time. Schedule the tool to run automatically using the third-party schedule library (pip install schedule) or a system cron job.

Using the schedule Library

import schedule
import time
from datetime import datetime

def weekly_analysis_job():
    """Run the full competitor analysis and save a timestamped report."""
    print(f"\n[{datetime.now()}] Starting scheduled competitor analysis...")

    df = run_competitor_analysis(
        keywords=target_keywords,
        domains=all_domains
    )

    import os  # local import so this snippet stays self-contained
    timestamp = datetime.now().strftime("%Y%m%d")
    os.makedirs("reports", exist_ok=True)  # to_csv fails if reports/ doesn't exist
    csv_file = f"reports/competitor_report_{timestamp}.csv"
    generate_report(df, output_file=csv_file)

    print(f"[{datetime.now()}] Analysis complete. Report saved.")

# Run every Monday at 8:00 AM
schedule.every().monday.at("08:00").do(weekly_analysis_job)

print("Competitor analysis scheduler started. Press Ctrl+C to stop.")
while True:
    schedule.run_pending()
    time.sleep(60)

Using Cron (Linux/macOS)

Alternatively, add a cron entry to run the script every Monday morning:

# Run every Monday at 8:00 AM
# crontab -e
0 8 * * 1 /usr/bin/python3 /path/to/competitor_analysis.py >> /var/log/serp_analysis.log 2>&1

Save each week's CSV with a timestamp in the filename. Over time, you can load multiple CSVs into a single DataFrame and plot position trends with matplotlib or export them to a Google Sheet for stakeholder visibility.
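Loading the accumulated CSVs back into one DataFrame can be sketched as follows, assuming the reports/competitor_report_YYYYMMDD.csv naming used above:

```python
import glob
import pandas as pd

def load_report_history(pattern: str = "reports/competitor_report_*.csv") -> pd.DataFrame:
    """Concatenate all weekly report CSVs, tagging each row with its report date."""
    frames = []
    for path in sorted(glob.glob(pattern)):
        # The report date is embedded in the filename as YYYYMMDD
        date = path.rsplit("_", 1)[-1].split(".")[0]
        df = pd.read_csv(path, index_col="keyword")
        df["report_date"] = date
        frames.append(df)
    return pd.concat(frames) if frames else pd.DataFrame()
```

Once a few reports have accumulated, something like history.groupby("report_date")["yoursite.com"].mean() gives you a weekly average-position series ready for plotting.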

Going Further

SERP Feature Tracking

Beyond simple position tracking, Serpent API returns data on featured snippets, knowledge panels, image carousels, and People Also Ask boxes. Extend the analysis to track which competitors own these high-visibility SERP features for your target keywords — they often drive more clicks than traditional position 1 results.
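As a starting point, you can probe each response for feature blocks. Note that the key names below ('featured_snippet', 'knowledge_panel', 'people_also_ask') are assumptions for illustration; verify the actual field names against a real Serpent API response before relying on them:

```python
def extract_serp_features(serp_data: dict) -> dict:
    """Report which SERP features appear in a response.

    The feature key names are assumed for illustration; confirm them
    against a real Serpent API response for your account.
    """
    results = serp_data.get("results", {})
    feature_keys = ("featured_snippet", "knowledge_panel", "people_also_ask")
    return {key: bool(results.get(key)) for key in feature_keys}

# Works against any response shape without raising on missing keys:
sample = {"results": {"organic": [], "featured_snippet": {"url": "https://example.com"}}}
print(extract_serp_features(sample))
# {'featured_snippet': True, 'knowledge_panel': False, 'people_also_ask': False}
```

From there, apply find_domain_position to the URLs inside each feature block to see which competitor owns it.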

Trend Analysis and Alerting

Add a comparison function that loads this week's CSV alongside last week's and flags any domain that gained or lost more than 3 positions on any keyword. Send these alerts via email or Slack webhook so your team knows immediately when the competitive landscape shifts.
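The comparison itself is a small function over two report DataFrames. A sketch (wiring the alerts into email or Slack is left to your notification setup):

```python
import pandas as pd

def flag_position_changes(prev: pd.DataFrame, curr: pd.DataFrame,
                          threshold: int = 3) -> list:
    """Flag domains that moved more than `threshold` positions on any keyword."""
    alerts = []
    for keyword in curr.index.intersection(prev.index):
        for domain in curr.columns.intersection(prev.columns):
            old, new = prev.loc[keyword, domain], curr.loc[keyword, domain]
            # Skip keywords where either week has no ranking (NaN)
            if pd.notna(old) and pd.notna(new) and abs(new - old) > threshold:
                verb = "gained" if new < old else "lost"
                alerts.append(
                    f"{domain} {verb} {abs(int(new - old))} positions on "
                    f"'{keyword}' ({int(old)} -> {int(new)})"
                )
    return alerts

last_week = pd.DataFrame({"competitor-a.com": [9, 2]}, index=["kw one", "kw two"])
this_week = pd.DataFrame({"competitor-a.com": [2, 2]}, index=["kw one", "kw two"])
print(flag_position_changes(last_week, this_week))
# ["competitor-a.com gained 7 positions on 'kw one' (9 -> 2)"]
```

Feed it the current DataFrame plus the previous week's CSV loaded via pd.read_csv(..., index_col="keyword").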

Keyword Gap Discovery

For each competitor, fetch their top-ranking keywords by querying site:competitor.com and analyzing which keywords they rank for that you do not. This turns the tool from a rank tracker into a full keyword gap analyzer — something that typically requires a $200/month Ahrefs subscription.

For more SEO automation ideas, read our guides on automating SEO reports and building a keyword rank tracker.

Ready to Start Building?

Get started with Serpent API today. 100 free searches included, no credit card required.

Get Your Free API Key

Explore: SERP API · News API · Image Search API · Try in Playground