How to Automate SEO Reports with Search APIs
Why Automate SEO Reports
SEO reporting is one of those tasks that everyone agrees is important but nobody enjoys doing manually. Pulling ranking data, formatting it into a spreadsheet, adding charts, writing commentary, and emailing it to stakeholders every week takes hours that could be spent on actual optimization work.
Automated SEO reports turn that manual process into a system that runs on its own. Once set up, the system collects data, generates reports, and delivers them without human intervention. You get consistent, timely reports, and your team gets hours of their week back.
The cost of automation is dramatically lower than most people expect. With the Serpent API's pricing from $0.01/1K (DDG Scale tier), tracking 50 keywords daily costs just pennies per month. Compare that to the hours of manual work saved, and the ROI is obvious.
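To put a number on "pennies", here is the back-of-envelope arithmetic, assuming one check per keyword per day at the $0.01-per-1K rate quoted above:

```javascript
// Monthly cost estimate for daily tracking of 50 keywords.
// Assumes the $0.01 per 1,000 searches rate mentioned above.
const keywords = 50;
const checksPerDay = 1;
const daysPerMonth = 30;

const searchesPerMonth = keywords * checksPerDay * daysPerMonth; // 1500
const costPerSearch = 0.01 / 1000; // $0.01 per 1,000 searches
const monthlyCost = searchesPerMonth * costPerSearch;

console.log(`${searchesPerMonth} searches ≈ $${monthlyCost.toFixed(4)}/month`);
// 1500 searches ≈ $0.0150/month
```

Even at ten checks per day per keyword, the monthly bill stays well under a dollar.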
What to Include in SEO Reports
An effective SEO report should tell a story, not just dump data. Here are the key sections that make a report actionable:
Keyword Rankings
The core of any SEO report. Track your target keywords and show their current positions, week-over-week changes, and trend direction. Highlight keywords that have improved significantly or dropped by concerning amounts.
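Computing week-over-week movement is a simple diff between two snapshots. A minimal sketch, assuming each snapshot is an array of `{ keyword, position }` records (with `position` null when unranked), as the collection code later in this guide produces:

```javascript
// Compare two ranking snapshots and flag week-over-week movement.
// Assumes each snapshot is an array of { keyword, position } records,
// where position is null when the domain is not ranked.
function rankingChanges(current, previous) {
  const prevByKeyword = new Map(previous.map(r => [r.keyword, r.position]));
  return current.map(r => {
    const prev = prevByKeyword.get(r.keyword) ?? null;
    const delta = (r.position != null && prev != null) ? prev - r.position : null;
    return {
      keyword: r.keyword,
      position: r.position,
      previous: prev,
      change: delta, // positive = moved up
      trend: delta == null ? 'new/unranked' : delta > 0 ? 'up' : delta < 0 ? 'down' : 'flat'
    };
  });
}

const changes = rankingChanges(
  [{ keyword: 'task app', position: 4 }, { keyword: 'kanban', position: null }],
  [{ keyword: 'task app', position: 7 }, { keyword: 'kanban', position: 12 }]
);
// 'task app' moved up 3 positions; 'kanban' dropped out of the tracked range
```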
SERP Feature Presence
Modern SERPs include featured snippets, People Also Ask boxes, image packs, and video carousels. Track whether your pages appear in these features, as they can drive significant traffic even when you do not hold the top organic position.
Competitor Tracking
Monitor where your competitors rank for your target keywords. When a competitor moves up, it is worth investigating what they changed. This section helps you stay aware of competitive movements without manually checking.
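Detecting competitive movement is the same snapshot diff, keyed by domain. A sketch, assuming records shaped like the `competitors` array the collection code builds (`{ domain, position }`):

```javascript
// Flag competitors that moved up between two snapshots of the
// top organic results. Assumes each snapshot is an array of
// { domain, position } records.
function competitorMoves(current, previous) {
  const prevPos = new Map(previous.map(c => [c.domain, c.position]));
  return current
    .filter(c => prevPos.has(c.domain) && prevPos.get(c.domain) > c.position)
    .map(c => ({
      domain: c.domain,
      from: prevPos.get(c.domain),
      to: c.position
    }));
}

const moves = competitorMoves(
  [{ domain: 'rival.com', position: 2 }, { domain: 'other.com', position: 5 }],
  [{ domain: 'rival.com', position: 6 }, { domain: 'other.com', position: 5 }]
);
// → [{ domain: 'rival.com', from: 6, to: 2 }]
```

Any domain in the output is a candidate for a closer look at what changed on their page.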
Content Performance
Track which of your URLs rank for which keywords. Identify pages that rank for keywords they were not originally targeting (opportunities) and pages that have dropped out of the top results (problems).
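Inverting the data from keyword-centric to page-centric makes this view easy to build. A sketch, assuming records shaped like the collector's output (`{ keyword, position, url }`):

```javascript
// Group ranking records by the URL that ranks, so each page's
// keyword footprint is visible at a glance. Assumes records of
// shape { keyword, position, url }, with url null when unranked.
function rankingsByUrl(records) {
  const byUrl = new Map();
  for (const r of records) {
    if (!r.url) continue; // skip keywords where nothing ranks
    if (!byUrl.has(r.url)) byUrl.set(r.url, []);
    byUrl.get(r.url).push({ keyword: r.keyword, position: r.position });
  }
  return byUrl;
}

const pages = rankingsByUrl([
  { keyword: 'task app', position: 4, url: 'https://example.com/tasks' },
  { keyword: 'todo software', position: 9, url: 'https://example.com/tasks' },
  { keyword: 'gantt chart', position: null, url: null }
]);
// pages.get('https://example.com/tasks') lists both ranking keywords
```

Pages that pick up keywords they were not targeting show up as new entries in a URL's list; pages that vanish from the map entirely are the ones to investigate.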
Building the Data Pipeline
The data pipeline has three stages: collection, processing, and output. Here is the architecture:
// Pipeline architecture
// 1. COLLECT: Fetch SERP data from Serpent API
// 2. PROCESS: Parse results, compare with history, calculate changes
// 3. OUTPUT: Generate HTML report, optionally convert to PDF, send via email
const pipeline = {
  async collect(keywords, domain, country) { /* ... */ },
  process(currentData, historicalData) { /* ... */ },
  generateReport(processedData) { /* ... */ }
};
Collecting Data with Serpent API
The collection phase queries the Serpent API for each target keyword and extracts the ranking data you need:
const fs = require('fs');

async function collectRankingData(config) {
  const { apiKey, keywords, domain, country } = config;
  const results = [];

  for (const keyword of keywords) {
    const params = new URLSearchParams({
      q: keyword,
      num: 50,
      country: country || 'us'
    });

    const response = await fetch(
      `https://apiserpent.com/api/search?${params}`,
      { headers: { 'X-API-Key': apiKey } }
    );
    const data = await response.json();

    if (data.success) {
      // Find our domain's position (guard against results without a url)
      const organic = data.results.organic || [];
      const position = organic.findIndex(r => r.url && r.url.includes(domain));

      // Extract SERP features
      const features = {
        hasPeopleAlsoAsk: !!(data.results.peopleAlsoAsk?.length),
        hasRelatedSearches: !!(data.results.relatedSearches?.length),
        hasNews: !!(data.results.news?.length),
        hasImages: !!(data.results.images?.length)
      };

      // Extract top competitors
      const competitors = organic.slice(0, 10).map((r, i) => ({
        position: i + 1,
        title: r.title,
        url: r.url,
        domain: new URL(r.url).hostname
      }));

      results.push({
        keyword,
        position: position >= 0 ? position + 1 : null,
        url: position >= 0 ? organic[position].url : null,
        features,
        competitors,
        timestamp: new Date().toISOString()
      });
    }

    // Rate limiting: pause between requests
    await new Promise(r => setTimeout(r, 1500));
  }

  return results;
}

// Collect data (run inside an async context)
const data = await collectRankingData({
  apiKey: 'sk_live_your_api_key',
  keywords: [
    'project management tool',
    'task management software',
    'team collaboration app',
    'best project tracker'
  ],
  domain: 'example.com',
  country: 'us'
});
Generating HTML Reports
With data in hand, generate a clean HTML report that is easy to read and share:
function generateHTMLReport(data, config) {
  const date = new Date().toLocaleDateString('en-US', {
    weekday: 'long', year: 'numeric',
    month: 'long', day: 'numeric'
  });

  const keywordRows = data.map(item => {
    const posDisplay = item.position
      ? `#${item.position}`
      : '<span style="color: #ef4444;">Not ranked</span>';
    return `
      <tr>
        <td>${item.keyword}</td>
        <td>${posDisplay}</td>
        <td>${item.url || 'N/A'}</td>
        <td>${Object.entries(item.features)
          .filter(([, v]) => v)
          .map(([k]) => k.replace('has', ''))
          .join(', ') || 'None'}</td>
      </tr>`;
  }).join('');

  return `<!DOCTYPE html>
<html>
<head>
  <title>SEO Report - ${date}</title>
  <style>
    body { font-family: Arial, sans-serif; max-width: 800px;
           margin: 0 auto; padding: 20px; }
    h1 { color: #111; border-bottom: 2px solid #0d9488;
         padding-bottom: 10px; }
    table { width: 100%; border-collapse: collapse; margin: 20px 0; }
    th, td { padding: 12px; text-align: left;
             border-bottom: 1px solid #e5e7eb; }
    th { background: #f9fafb; font-weight: 600; }
    .summary { background: #f0fdf4; border: 1px solid #0d9488;
               border-radius: 8px; padding: 16px; margin: 20px 0; }
  </style>
</head>
<body>
  <h1>SEO Report</h1>
  <p>Generated: ${date}</p>
  <p>Domain: ${config.domain}</p>
  <div class="summary">
    <strong>Summary:</strong>
    ${data.filter(d => d.position).length} of ${data.length}
    keywords ranking in top 50
  </div>
  <h2>Keyword Rankings</h2>
  <table>
    <thead>
      <tr>
        <th>Keyword</th>
        <th>Position</th>
        <th>URL</th>
        <th>SERP Features</th>
      </tr>
    </thead>
    <tbody>${keywordRows}</tbody>
  </table>
</body>
</html>`;
}

// Generate and save
const html = generateHTMLReport(data, { domain: 'example.com' });
fs.writeFileSync('seo-report.html', html);
Creating PDF Reports
For stakeholders who prefer PDF attachments, you can convert your HTML report to PDF using a library like Puppeteer:
const puppeteer = require('puppeteer');

async function htmlToPdf(htmlPath, pdfPath) {
  const browser = await puppeteer.launch({ headless: 'new' });
  const page = await browser.newPage();
  const htmlContent = fs.readFileSync(htmlPath, 'utf8');
  await page.setContent(htmlContent, { waitUntil: 'networkidle0' });
  await page.pdf({
    path: pdfPath,
    format: 'A4',
    margin: { top: '20mm', bottom: '20mm', left: '15mm', right: '15mm' },
    printBackground: true
  });
  await browser.close();
  console.log(`PDF saved to ${pdfPath}`);
}

await htmlToPdf('seo-report.html', 'seo-report.pdf');
Scheduling with Cron and GitHub Actions
For local or VPS hosting, cron is the simplest scheduling option:
# Run every Monday at 8:00 AM
0 8 * * 1 /usr/bin/node /home/user/seo-reporter/index.js
For a fully cloud-based solution, GitHub Actions provides free scheduled workflows:
# .github/workflows/seo-report.yml
name: Weekly SEO Report

on:
  schedule:
    - cron: '0 8 * * 1'  # Every Monday at 8 AM UTC
  workflow_dispatch:      # Manual trigger

jobs:
  generate-report:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
      - run: npm install
      - run: node generate-report.js
        env:
          SERPENT_API_KEY: ${{ secrets.SERPENT_API_KEY }}
          REPORT_EMAIL: ${{ secrets.REPORT_EMAIL }}
      - uses: actions/upload-artifact@v4
        with:
          name: seo-report
          path: reports/
Email Delivery
Automatically emailing reports to stakeholders closes the loop on full automation. Here is a simple approach using Nodemailer:
const nodemailer = require('nodemailer');

async function emailReport(reportPath, recipients) {
  const transporter = nodemailer.createTransport({
    host: process.env.SMTP_HOST,
    port: 587,
    auth: {
      user: process.env.SMTP_USER,
      pass: process.env.SMTP_PASS
    }
  });

  const date = new Date().toLocaleDateString('en-US', {
    month: 'long', day: 'numeric', year: 'numeric'
  });

  await transporter.sendMail({
    from: '"SEO Reporter" <reports@example.com>',
    to: recipients.join(', '),
    subject: `Weekly SEO Report - ${date}`,
    text: 'Your weekly SEO report is attached.',
    html: '<p>Your weekly SEO report is attached.</p>',
    attachments: [{
      filename: `seo-report-${date}.pdf`,
      path: reportPath
    }]
  });

  console.log(`Report emailed to ${recipients.length} recipients`);
}

await emailReport('seo-report.pdf', [
  'marketing@example.com',
  'ceo@example.com'
]);
Putting It All Together
Here is the complete automated reporting script that ties all the pieces together:
const fs = require('fs');

async function runSEOReport() {
  const config = {
    apiKey: process.env.SERPENT_API_KEY || 'sk_live_your_api_key',
    domain: 'example.com',
    country: 'us',
    keywords: [
      'project management software',
      'task tracking tool',
      'team collaboration platform',
      'agile project management',
      'kanban board software'
    ],
    recipients: ['team@example.com']
  };

  console.log('1. Collecting ranking data...');
  const data = await collectRankingData(config);
  console.log(`   Collected data for ${data.length} keywords`);

  console.log('2. Generating HTML report...');
  const html = generateHTMLReport(data, config);
  const htmlPath = `reports/seo-report-${Date.now()}.html`;
  fs.mkdirSync('reports', { recursive: true });
  fs.writeFileSync(htmlPath, html);

  console.log('3. Converting to PDF...');
  const pdfPath = htmlPath.replace('.html', '.pdf');
  await htmlToPdf(htmlPath, pdfPath);

  console.log('4. Emailing report...');
  await emailReport(pdfPath, config.recipients);

  console.log('Done! Report generated and sent.');
}

runSEOReport().catch(console.error);
This script is designed to run unattended. Set it up once, configure your environment variables, and let it deliver reports to your inbox automatically. If any step fails, the error is caught and logged so you can debug without the system silently breaking.
The modular design also makes it easy to extend. Want to add competitor tracking? Add another data collection function. Need a Slack notification instead of email? Swap the delivery method. The pipeline pattern keeps each concern separate and testable.
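As an example of swapping the delivery method, here is a sketch of a Slack delivery step. It assumes an incoming-webhook URL stored in a `SLACK_WEBHOOK_URL` environment variable and data shaped like the collector's output:

```javascript
// Build a short ranking summary in Slack's incoming-webhook payload
// format. Assumes data records of shape { keyword, position }.
function buildSlackSummary(data) {
  const ranked = data.filter(d => d.position != null);
  const lines = data.map(d =>
    `• ${d.keyword}: ${d.position != null ? '#' + d.position : 'not ranked'}`
  );
  return {
    text: `Weekly SEO report: ${ranked.length}/${data.length} keywords ranking\n${lines.join('\n')}`
  };
}

// Post the summary to a Slack incoming webhook. The SLACK_WEBHOOK_URL
// environment variable is an assumption of this sketch.
async function sendSlackReport(data) {
  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildSlackSummary(data))
  });
}
```

Because the payload builder is separate from the HTTP call, it slots into the pipeline exactly where `emailReport` sits today and is testable without hitting Slack.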
Ready to get started?
Sign up for Serpent API and get 100 free searches. No credit card required.
Try for Free · Explore: SERP API · News API · Image Search API · Try in Playground