Build a Google Rank Tracker in 100 Lines of Python (2026 Working Code)

By Anurag Pathak · 10 min read

Most rank-tracker tutorials are six thousand words long, three thousand lines of code, and end with a Docker compose file you will never run. This is not that. This is the 100-line weekend version: SQLite in a single file, one Python script, one cron entry, and a weekly email when something moves. It is the tracker I actually run on my own keywords.

Copy. Paste. Replace the API key. Run.

The Stack

  - Python 3 (standard library: sqlite3, smtplib, email)
  - requests, the only third-party dependency (pip install requests)
  - A SERP API for the Google results (Serpent API here)
  - cron for scheduling

That is it. No frameworks, no async, no Docker. The whole script is one file, runs anywhere Python runs, and finishes in under 30 seconds for a 100-keyword list.

The Full Script

Save this as tracker.py:

import os, sqlite3, requests, smtplib
from datetime import date
from email.mime.text import MIMEText

API_KEY = os.environ["SERPENT_API_KEY"]
DOMAIN = os.environ.get("TRACK_DOMAIN", "yourdomain.com")
DB = "ranks.db"

KEYWORDS = [
    "best protein powder",
    "react vs vue",
    "ergonomic chair under 300",
    # ...add your list
]

def init_db():
    c = sqlite3.connect(DB)
    c.execute("""CREATE TABLE IF NOT EXISTS ranks (
        snap_date TEXT, keyword TEXT, position INTEGER,
        in_aio INTEGER, top_url TEXT,
        PRIMARY KEY (snap_date, keyword)
    )""")
    c.commit()
    return c

def matches_domain(d):
    # Exact domain or a true subdomain; a bare endswith would also
    # match lookalikes like notyourdomain.com.
    return d == DOMAIN or d.endswith("." + DOMAIN)

def fetch_rank(keyword):
    r = requests.get("https://apiserpent.com/api/search", params={
        "q": keyword, "engine": "google", "country": "us",
        "api_key": API_KEY,
    }, timeout=30)
    r.raise_for_status()
    data = r.json()
    organic = data.get("organic_results", [])
    pos = next(
        (i + 1 for i, o in enumerate(organic)
         if matches_domain(o.get("domain") or "")),
        None,
    )
    aio_sources = (data.get("ai_overview") or {}).get("sources") or []
    in_aio = 1 if any(
        matches_domain(s.get("domain") or "") for s in aio_sources
    ) else 0
    top_url = organic[0]["url"] if organic else None
    return pos, in_aio, top_url

def snapshot():
    conn = init_db()
    today = date.today().isoformat()
    for kw in KEYWORDS:
        try:
            pos, in_aio, top_url = fetch_rank(kw)
            conn.execute(
                "INSERT OR REPLACE INTO ranks VALUES (?, ?, ?, ?, ?)",
                (today, kw, pos, in_aio, top_url),
            )
            print(f"{kw}: pos={pos} aio={in_aio}")
        except Exception as e:
            print(f"FAIL {kw}: {e}")
    conn.commit()
    conn.close()

def diff_report():
    conn = sqlite3.connect(DB)
    rows = conn.execute("""
        SELECT t.keyword, t.position AS today_pos, p.position AS prev_pos,
               t.in_aio AS today_aio, p.in_aio AS prev_aio,
               p.keyword AS prev_kw
        FROM ranks t LEFT JOIN ranks p
          ON t.keyword = p.keyword
          AND p.snap_date = date(t.snap_date, '-7 day')
        WHERE t.snap_date = (SELECT MAX(snap_date) FROM ranks)
    """).fetchall()
    conn.close()

    moves = []
    for kw, tp, pp, ta, pa, prev_kw in rows:
        if prev_kw is None:
            continue  # no snapshot 7 days ago; nothing to diff against
        if tp != pp:
            moves.append(f"{kw}: {pp or '-'} -> {tp or '-'}")
        if ta != pa:
            moves.append(f"{kw} AIO: {pa} -> {ta}")
    return moves

def send_email(moves):
    if not moves:
        return
    body = "\n".join(moves)
    msg = MIMEText(body)
    msg["Subject"] = f"Rank changes ({len(moves)})"
    msg["From"] = os.environ["SMTP_FROM"]
    msg["To"] = os.environ["SMTP_TO"]
    with smtplib.SMTP(os.environ["SMTP_HOST"], 587) as s:
        s.starttls()
        s.login(os.environ["SMTP_USER"], os.environ["SMTP_PASS"])
        s.send_message(msg)

if __name__ == "__main__":
    snapshot()
    moves = diff_report()
    print("\n".join(moves) or "no moves")
    send_email(moves)

Line count: right around 100 with imports and blank lines. The whole thing fits on a single screen.
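The domain match deserves a moment of attention: a bare endswith would also accept lookalike domains, because "notyourdomain.com".endswith("yourdomain.com") is True. Checking for the exact domain or a dot-prefixed subdomain avoids that. A minimal illustration:

```python
DOMAIN = "yourdomain.com"

def matches_domain(d: str) -> bool:
    # Exact match, or a true subdomain; the leading dot rules out lookalikes
    return d == DOMAIN or d.endswith("." + DOMAIN)

print(matches_domain("yourdomain.com"))       # True
print(matches_domain("blog.yourdomain.com"))  # True
print(matches_domain("notyourdomain.com"))    # False
```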

How to Run It

  1. Get an API key. Sign up at apiserpent.com — you get 10 free Google searches on signup, enough to test the script with a few keywords.
  2. Set environment variables.
    export SERPENT_API_KEY=sk_live_your_key
    export TRACK_DOMAIN=yourdomain.com
    export SMTP_HOST=smtp.resend.com
    export SMTP_USER=resend
    export SMTP_PASS=re_your_resend_key
    export SMTP_FROM=tracker@yourdomain.com
    export SMTP_TO=you@yourdomain.com
  3. Edit the keyword list. Replace the placeholder list at the top of the script with your real keywords.
  4. Test it once.
    python tracker.py
    You should see one line per keyword printed, then either a list of moves or "no moves" (the first run has nothing to diff against).
  5. Schedule it weekly. Add a cron entry:
    # Every Monday at 7am
    0 7 * * 1 cd /opt/tracker && /usr/bin/python3 tracker.py >> tracker.log 2>&1
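If you want to see the schema in action before spending API credits, you can hand-insert a row and query it back with the sqlite3 CLI (assuming it is installed; demo.db here is a throwaway file using the same schema as the script):

```shell
sqlite3 demo.db "CREATE TABLE IF NOT EXISTS ranks (snap_date TEXT, keyword TEXT, position INTEGER, in_aio INTEGER, top_url TEXT, PRIMARY KEY (snap_date, keyword));"
sqlite3 demo.db "INSERT OR REPLACE INTO ranks VALUES ('2026-01-12', 'react vs vue', 9, 1, 'https://example.com');"
sqlite3 demo.db "SELECT snap_date, keyword, position FROM ranks;"
# prints: 2026-01-12|react vs vue|9
```

The same queries work against the real ranks.db once the tracker has run.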

What the Email Looks Like

After the second run, you get an email like:

Subject: Rank changes (4)

best protein powder: 7 -> 4
react vs vue: 12 -> 9
ergonomic chair under 300 AIO: 0 -> 1
buy iphone 17 case: 3 -> -

The "AIO: 0 -> 1" line means your domain just appeared as a citation inside the Google AI Overview for that query. The "3 -> -" line means you fell out of the top 100. Both are signals worth knowing immediately.

What to Add Next (If You Want)

The 100-line version covers the 80% case. The obvious upgrades: a device column for mobile tracking, more countries, and trend charts built on the SQLite history. Each of those maps onto a limitation in the next section.

What This Tracker Will Not Do

To stay honest, here is what the 100-line version misses:

  1. Search volume. Position 1 for a 10-search-a-month query is a vanity metric. Pair this tracker with the keyword research API of your choice if you need volume.
  2. Local pack tracking. National rank only. For local pack at lat-long precision, see our local SEO rank tracking tutorial.
  3. Mobile vs desktop. Defaults to desktop. Add device=mobile to the API params and a column to the schema if you want both.
  4. Historical reporting. The diff email only compares today vs 7 days ago. For trend charts, use the SQLite data with any plotting library.
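For trend charts, the history for one keyword is a single query against the ranks table (table and column names from the script above); the pairs it returns can be fed straight into any plotting library:

```python
import sqlite3

def history(db_path, keyword):
    """Return (snap_date, position) pairs for one keyword, oldest first."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT snap_date, position FROM ranks "
        "WHERE keyword = ? ORDER BY snap_date",
        (keyword,),
    ).fetchall()
    conn.close()
    return rows
```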

Cost Math

Tracking 100 keywords weekly = 400 queries per month. Per provider:

  Provider                                   Cost / month
  Serpent API (Scale tier, quick search)     $0.12
  Serper.dev (volume pricing)                $0.12
  DataForSEO Standard Queue                  $0.24
  SerpApi.com Developer plan                 $75 fixed (unused queries lost)

For a 1,000-keyword tracker (4,000 queries/month) the math gets more interesting: $1.20 on Serpent vs the same fixed $75 on SerpApi. Pay-as-you-go scales linearly; the subscription does not.

Get the SERP API Behind the Tracker

Serpent API gives every new account 10 free Google searches with full AI Overview text and source citations — no credit card. Enough to run the tracker with a small keyword list before you spend a cent.

Get Your Free API Key

Explore: SERP API · Pricing · Playground

FAQ

Can you really build a rank tracker in 100 lines?

Yes. The full code lands right around 100 lines including imports and schema. SQLite handles storage. The SERP API handles the hard part (Google scraping, parsing, anti-bot). What is left is glue code.

How accurate is a DIY tracker compared to Ahrefs?

Comparable. SaaS tools layer historical context and dashboards on top of similar SERP infrastructure. For pure position tracking, a DIY tracker lands within 1 to 2 positions of Ahrefs in most spot checks; Google itself varies results by datacenter, location, and personalization, so even commercial tools disagree by about that much.

How much does it cost to run?

At Serpent API Scale tier, tracking 100 keywords weekly costs about $0.12 a month. The same 400 queries at SerpApi's per-search rate work out to around $6, but its cheapest plan is a fixed $75 a month, so that is the real bill. Storage is free if you use SQLite on a $5 VPS.

Can I track from a specific country or city?

Yes. Pass country in the API call. For city-level precision, pass lat-long coordinates. See our local SEO rank tracking tutorial for the city-grid version.
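For reference, the request the tracker sends already carries the country parameter, so switching markets is a one-line change. The "gb" value and placeholder key below are purely illustrative:

```python
# Same parameter shape the tracker's fetch_rank sends; only "country" differs
params = {
    "q": "best protein powder",
    "engine": "google",
    "country": "gb",            # two-letter country code; the script uses "us"
    "api_key": "YOUR_API_KEY",  # placeholder
}
```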

Why not scrape Google directly?

Direct scraping requires residential proxies, headless browsers, parser maintenance, and CAPTCHA handling. Total cost in proxies alone usually exceeds paying for a SERP API. Use the API for data and write your tracker in 100 lines instead of 5,000.

Can I use this code commercially?

Yes. Treat it as MIT-licensed. If you build something useful on top, a link back to this guide is appreciated but not required.