
How I Use Search Console + Python to Find SEO Quick Wins in 10 Minutes

No paid tools required. Just the GSC API, a service account, and under 100 lines of Python. Here's the exact script I use every week, with real data from my own site.

The logic behind SEO quick wins is simple: keywords in positions 6-20 have already been noticed by Google. The site has demonstrated relevance. All that's needed is a small push — a more precise title tag, an extra paragraph, an internal link — to climb above the click threshold.

The challenge is finding that push. Google Search Console shows all the data, but navigating it manually for every site is slow and imprecise. I've automated the process. Every Monday morning, a Python script connects to the GSC API, downloads the last 28 days of performance, filters keywords in the target range, and returns a sorted list of opportunities with estimated traffic gain.

The whole thing takes under 10 minutes — including time to read the report.

Why Position 6-20 Is the Hidden Treasure

The average CTR by position in Google drops off steeply. Position 1 captures around 28% of clicks, position 3 gets 11%, position 5 gets 7%. Drop to position 6 and CTR falls to about 5%. Position 10: about 2%. Below page one: almost zero.

Expected CTR by position

Pos 1  → ~28% CTR
Pos 3  → ~11% CTR
Pos 5  →  ~7% CTR
Pos 10 →  ~2% CTR   ← visibility threshold
Pos 20 → ~0.5% CTR  ← nearly invisible

A keyword with 1,000 monthly impressions at position 8 (roughly 3% CTR) generates about 30 clicks. Lift it to position 3 (roughly 11% CTR) and that becomes 110 clicks. Same keyword, same content, just sharpened, and traffic nearly quadruples. That's why I focus on this range.
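The arithmetic can be checked with the same formula the script later applies: (target CTR minus current CTR) times impressions. The helper name here is mine, for illustration:

```python
# Back-of-the-envelope gain estimate, matching the script's formula:
# (target_ctr - current_ctr) * impressions, floored at zero.
def estimated_gain(impressions, current_ctr, target_ctr=0.11):
    return max(0, round((target_ctr - current_ctr) * impressions))

# 1,000 impressions at position 8 (~3% CTR), aiming for position 3 (~11% CTR)
print(estimated_gain(1000, 0.03))  # → 80
```

Note the floor at zero: a keyword already beating the target CTR is not an opportunity, so it scores 0 rather than a negative gain.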

Setup: Connect the GSC API with a Service Account

Before writing any Python, you need to configure authentication. A service account authenticates silently without human interaction — ideal for automated scripts.

setup — step 1

# Go to console.cloud.google.com
# 1. New project → name: "seo-tools"
# 2. APIs & Services → Library → search "Google Search Console API" → Enable
# 3. Credentials → Create credentials → Service account
#    Name: seo-agent
#    Click on the account → Keys tab → Add key → JSON
#    Download service-account.json

pip install google-auth google-auth-httplib2 google-api-python-client

Then go to Search Console → Settings → Users and permissions → add the service account email with at least Full permission. Without this step, the API returns a 403 permission error even if the credentials themselves are valid.
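Before running the full script, it's worth a quick connection test. This sketch (the `list_properties` helper is mine) lists every property the service account can see; if your site is missing from the output, the permission step above didn't take:

```python
def list_properties(key_file='service-account.json'):
    """Print each GSC property the service account can access."""
    # Imports kept inside the function so this file can be imported
    # even before the Google client libraries are installed.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        key_file,
        scopes=['https://www.googleapis.com/auth/webmasters.readonly'])
    service = build('searchconsole', 'v1', credentials=creds)
    for site in service.sites().list().execute().get('siteEntry', []):
        print(site['siteUrl'], '-', site['permissionLevel'])
```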

The Complete Script: Quick Wins in One Shot

Here's the script. It authenticates, downloads the last 28 days of queries, filters those in the 6-20 range with impressions above threshold, and prints a list sorted by estimated traffic gain:

gsc_quick_wins.py

from google.oauth2 import service_account
from googleapiclient.discovery import build
from datetime import date, timedelta
import json

KEY_FILE = 'service-account.json'
SITE_URL = 'https://yoursite.com/'
SCOPES   = ['https://www.googleapis.com/auth/webmasters.readonly']

MIN_IMPRESSIONS = 3
POS_MIN = 6.0
POS_MAX = 20.0
DAYS    = 28

EXPECTED_CTR = {
    1: 0.28, 2: 0.15, 3: 0.11, 4: 0.08, 5: 0.07,
    6: 0.05, 7: 0.04, 8: 0.03, 9: 0.025, 10: 0.02
}

def get_service():
    creds = service_account.Credentials.from_service_account_file(
        KEY_FILE, scopes=SCOPES)
    return build('searchconsole', 'v1', credentials=creds)

def fetch_queries(service, start_date, end_date):
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            'startDate': str(start_date),
            'endDate': str(end_date),
            'dimensions': ['query', 'page'],
            # Note: searchanalytics.query has no orderBy field; rows come
            # back sorted by clicks descending, and we re-sort by gain later
            'rowLimit': 5000
        }
    ).execute()
    return response.get('rows', [])

def find_quick_wins(rows):
    opportunities = []
    for row in rows:
        query  = row['keys'][0]
        page   = row['keys'][1]
        clicks = row['clicks']
        imps   = row['impressions']
        ctr    = row['ctr']
        pos    = row['position']

        if not (POS_MIN <= pos <= POS_MAX) or imps < MIN_IMPRESSIONS:
            continue

        # Assume the keyword can climb to position 3 and earn its CTR
        target_ctr = EXPECTED_CTR[3]
        estimated_gain = round((target_ctr - ctr) * imps)

        opportunities.append({
            'query': query,
            'page': page.replace(SITE_URL, '/'),
            'position': round(pos, 1),
            'impressions': int(imps),
            'clicks': int(clicks),
            'estimated_gain': max(0, estimated_gain)
        })

    return sorted(opportunities, key=lambda x: x['estimated_gain'], reverse=True)

def main():
    # GSC data lags by roughly 2 days, so close the window 3 days back
    end   = date.today() - timedelta(days=3)
    start = end - timedelta(days=DAYS)

    service = get_service()
    rows    = fetch_queries(service, start, end)
    opportunities = find_quick_wins(rows)

    print(f"\n{'='*65}")
    print(f"  SEO QUICK WINS — {date.today()}")
    print(f"  Site: {SITE_URL} | Window: {DAYS} days")
    print(f"{'='*65}\n")

    if not opportunities:
        print("  No quick wins found with current filters.")
        return

    print(f"  {'#':<4} {'QUERY':<40} {'POS':>5} {'IMP':>6} {'GAIN':>6}")
    print(f"  {'-'*65}")
    for i, opp in enumerate(opportunities[:20], 1):
        q = opp['query'][:38] + '..' if len(opp['query']) > 38 else opp['query']
        print(f"  {i:<4} {q:<40} {opp['position']:>5} "
              f"{opp['impressions']:>6} {opp['estimated_gain']:>5}+")

    with open('quick_wins.json', 'w', encoding='utf-8') as f:
        json.dump(opportunities, f, ensure_ascii=False, indent=2)

if __name__ == '__main__':
    main()

What to Do with Each Quick Win

Keywords in pos 6-10 with high impressions

Action: optimize title tag and meta description

You're on page one but CTR is low. The problem is almost always the title — not specific enough, not intent-aligned. Rewrite the title tag with the target keyword and a concrete number or promise. Update the meta description with an explicit call to action.

Keywords in pos 11-15

Action: expand content and add internal links

You're on page two. Google sees you as relevant but not authoritative enough on that topic. Add 300-500 words of quality content — practical examples, FAQs, deeper explanations. Then add 2-3 internal links from authoritative site pages pointing to this one with relevant anchor text.

Keywords in pos 16-20

Action: consider creating a dedicated page

You might be ranking 16-20 because you're using a generic page for a keyword that deserves dedicated content. Check if the keyword is specific enough to justify a dedicated article or landing page. If yes, create the dedicated content.
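These three action buckets can be generated straight from the opportunity list the script saves to quick_wins.json. A minimal sketch (the bucketing function and the sample data are mine, not part of the script above):

```python
def bucket_quick_wins(opportunities):
    """Group opportunities into the three action buckets by position."""
    buckets = {'rewrite title/meta (6-10)': [],
               'expand content (11-15)': [],
               'dedicated page (16-20)': []}
    for opp in opportunities:
        if opp['position'] <= 10:
            buckets['rewrite title/meta (6-10)'].append(opp['query'])
        elif opp['position'] <= 15:
            buckets['expand content (11-15)'].append(opp['query'])
        else:
            buckets['dedicated page (16-20)'].append(opp['query'])
    return buckets

# Hypothetical sample rows shaped like the script's JSON output
sample = [{'query': 'gsc api python', 'position': 8.2},
          {'query': 'seo quick wins', 'position': 13.5}]
for action, queries in bucket_quick_wins(sample).items():
    print(f"{action}: {queries}")
```

In practice you'd load `json.load(open('quick_wins.json'))` instead of the sample list and hand each bucket to whoever owns that type of fix.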

Automate: Weekly Report with Cron

crontab — every Monday at 8am

# Add with: crontab -e
0 8 * * 1 cd /path/to/scripts && python3 gsc_quick_wins.py >> logs/quick_wins.log 2>&1

Every Monday morning you get the week's opportunity list without opening a browser. Add a Slack or email notification and it lands directly in your inbox.
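For the Slack route, an incoming webhook is enough and needs no extra libraries. A sketch under those assumptions (the webhook URL is a placeholder you create in Slack, and both helpers are mine):

```python
import json
import urllib.request

WEBHOOK_URL = 'https://hooks.slack.com/services/XXX/YYY/ZZZ'  # placeholder

def format_message(opportunities, top=5):
    """Turn the top opportunities into a short Slack message."""
    lines = [f"{o['query']} (pos {o['position']}, +{o['estimated_gain']} clicks)"
             for o in opportunities[:top]]
    return "SEO quick wins this week:\n" + "\n".join(lines)

def notify_slack(opportunities):
    """POST the formatted report to a Slack incoming webhook."""
    payload = json.dumps({'text': format_message(opportunities)}).encode()
    req = urllib.request.Request(WEBHOOK_URL, data=payload,
                                 headers={'Content-Type': 'application/json'})
    urllib.request.urlopen(req)
```

Call `notify_slack(opportunities)` at the end of `main()` and the Monday cron run reports straight into your channel.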

Want me to set this up for your site?

I configure the GSC pipeline, identify existing quick wins, and deliver a prioritized action list with estimated traffic gain. No additional paid tools.

Request Free Audit
