SERP API Python Tutorial: Scrape Google Results

Learn how to use Searlo's Google SERP API with Python. This guide covers basic search queries, geo-targeting, rank tracking, SERP feature extraction, and AI/LLM integration.

Python 3.6+
~15 min read
Beginner friendly

Prerequisites

  • Python 3.6 or higher installed
  • Free Searlo account (sign up here)
  • Basic understanding of Python and REST APIs

Quick Start Steps

Step 1: Get Your API Key

Sign up for free at dashboard.searlo.tech and copy your API key from the settings page.

Step 2: Install Dependencies

Install the requests library for making HTTP calls.

Step 3: Make Your First Request

Use the search endpoint to query Google and get structured results.

Step 4: Parse the Response

Extract organic results, featured snippets, and other SERP features.

Step 5: Build Your Application

Use the data for rank tracking, SEO analysis, or AI applications.
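Taken together, steps 3 and 4 amount to one authenticated GET request plus JSON parsing. Here's a minimal sketch (the endpoint and parameter names match the sections below; the API key is a placeholder):

```python
import requests

API_KEY = "your_api_key_here"  # copied from dashboard.searlo.tech

def build_search_request(query, num_results=10):
    """Assemble the search endpoint URL and query parameters (step 3)."""
    url = "https://api.searlo.tech/v1/search"
    params = {"q": query, "num": num_results}
    return url, params

def quick_search(query):
    """Run one search and return the parsed JSON body (step 4)."""
    url, params = build_search_request(query)
    response = requests.get(
        url,
        headers={"Authorization": f"Bearer {API_KEY}"},
        params=params,
        timeout=30,
    )
    response.raise_for_status()  # surface auth/quota errors early
    return response.json()
```

Calling raise_for_status() turns a bad key or exhausted quota into a clear exception instead of a confusing KeyError when you parse the body later.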

1. Installation

Install the requests library if you don't have it already:

pip install requests

2. Basic Google Search

Here's a simple function to search Google and get structured results:

import requests

API_KEY = "your_api_key_here"
BASE_URL = "https://api.searlo.tech/v1"

def search_google(query, num_results=10):
    """Search Google using Searlo API"""
    response = requests.get(
        f"{BASE_URL}/search",
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={
            "q": query,
            "num": num_results,
        },
        timeout=30,
    )
    response.raise_for_status()  # fail fast on a bad key or exhausted quota
    return response.json()

# Example usage
results = search_google("best python frameworks 2026")
for result in results.get("organic_results", []):
    print(f"{result['position']}. {result['title']}")
    print(f"   {result['link']}")

3. Geo-Targeted Search

Search from specific countries or cities to get localized results:

def search_localized(query, country="us", language="en", location=None):
    """Search with geo-targeting"""
    params = {
        "q": query,
        "gl": country,   # Country code
        "hl": language,  # Interface language
    }
    if location:
        params["location"] = location  # Optional city, e.g. "New York, NY"
    response = requests.get(
        f"{BASE_URL}/search",
        headers={"Authorization": f"Bearer {API_KEY}"},
        params=params,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Search from UK
uk_results = search_localized("coffee shops", country="uk")

4. Build a Rank Tracker

Track your domain's position for multiple keywords:

import json
from datetime import datetime

def track_rankings(keywords, domain):
    """Track domain rankings for multiple keywords"""
    rankings = {}
    
    for keyword in keywords:
        results = search_google(keyword, num_results=100)
        position = None
        
        for result in results.get("organic_results", []):
            if domain in result.get("link", ""):
                position = result["position"]
                break
        
        rankings[keyword] = {
            "position": position,
            "date": datetime.now().isoformat(),
            "keyword": keyword
        }
    
    return rankings

# Track rankings for your domain
keywords = ["serp api", "google search api", "web scraping api"]
my_rankings = track_rankings(keywords, "searlo.tech")
print(json.dumps(my_rankings, indent=2))

5. Extract SERP Features

Get featured snippets, People Also Ask, AI Overviews, and more:

def extract_serp_features(query):
    """Extract all SERP features from results"""
    results = search_google(query)
    
    features = {
        "organic_results": len(results.get("organic_results", [])),
        "featured_snippet": results.get("featured_snippet"),
        "people_also_ask": results.get("people_also_ask", []),
        "knowledge_panel": results.get("knowledge_panel"),
        "related_searches": results.get("related_searches", []),
        "ai_overview": results.get("ai_overview"),  # Google SGE
    }
    
    return features

# Analyze SERP features
features = extract_serp_features("what is machine learning")
if features["featured_snippet"]:
    print("Featured Snippet:", features["featured_snippet"]["snippet"])
if features["ai_overview"]:
    print("AI Overview:", features["ai_overview"]["summary"])

6. AI/LangChain Integration

Use Searlo with LangChain for AI agent applications:

from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI
import requests

class SearloSearchTool:
    """LangChain-compatible search tool"""
    
    def __init__(self, api_key):
        self.api_key = api_key
    
    def search(self, query: str) -> str:
        response = requests.get(
            "https://api.searlo.tech/v1/search",
            headers={"Authorization": f"Bearer {self.api_key}"},
            params={"q": query, "num": 5, "format": "toon"}
        )
        return response.text

# Use with LangChain
search_tool = Tool(
    name="Web Search",
    func=SearloSearchTool(API_KEY).search,
    description="Search the web for current information"
)

agent = initialize_agent(
    tools=[search_tool],
    llm=OpenAI(temperature=0),
    agent="zero-shot-react-description"
)

result = agent.run("What are the latest AI developments?")

Frequently Asked Questions

What Python version is required?

Searlo's API works with Python 3.6+. We recommend Python 3.9 or later for the best experience.

Is there an official Python SDK?

Yes! Install it with `pip install searlo`. It provides type hints and convenience methods. The examples here use raw requests for clarity.

How do I handle rate limits?

Searlo returns a 429 status code when rate limited. Implement exponential backoff or use our async endpoints for high-volume requests.
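A minimal backoff wrapper for the 429 case might look like this (the retry schedule and 60-second cap are illustrative choices, not Searlo requirements):

```python
import time
import requests

def backoff_delays(max_retries=5, base=1.0):
    """Exponential delay schedule: 1s, 2s, 4s, ... capped at 60s."""
    return [min(base * (2 ** i), 60.0) for i in range(max_retries)]

def get_with_backoff(url, headers=None, params=None, max_retries=5):
    """GET a URL, retrying on HTTP 429 with exponential backoff."""
    response = None
    for delay in backoff_delays(max_retries):
        response = requests.get(url, headers=headers, params=params, timeout=30)
        if response.status_code != 429:
            return response
        # Prefer the server's Retry-After header when it is present
        time.sleep(float(response.headers.get("Retry-After", delay)))
    return response  # still rate limited after all retries
```

Honoring a Retry-After header when the server sends one is generally kinder to the API than a fixed client-side schedule.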

Can I use this for commercial projects?

Absolutely! Searlo is designed for production use. Our paid plans include commercial licensing and SLA guarantees.

Ready to start building?

Get 1,000 free searches per month. No credit card required.