Web Scraping

Query Params and Search Forms

Learn how websites pass search filters through URLs and forms, and how to scrape filtered results using query parameters.

David Miller
December 21, 2025

Most websites allow searching or filtering:

- products by category
- jobs by city
- news by keyword

These filters are usually sent as **query parameters**.

Example URL: `site.com/search?q=python&page=2`

Here:

- `q` = `python`
- `page` = `2`

## Why this matters

Instead of clicking through forms manually, you can:

- change URL parameters
- loop over values
- collect large datasets

---

## Inspect query params

Open your browser's DevTools → Network tab and trigger a search request. You may see something like:

```
?q=laptop&min_price=500
```

---

## Using params with requests

```python
import requests

url = "https://example.com/search"
params = {
    "q": "python",
    "page": 1
}

res = requests.get(url, params=params)
print(res.url)  # full URL with params
```

---

## Loop over search pages

```python
for page in range(1, 4):
    params = {"q": "python", "page": page}
    res = requests.get("https://example.com/search", params=params)
    print("Scraping:", res.url)
```

---

## Graph: query param flow

```mermaid
flowchart LR
    A[Search Form] --> B[URL Params]
    B --> C["requests.get()"]
    C --> D[Filtered Results]
```

## Remember

- Forms usually map to URL parameters
- You can build URLs programmatically
- This unlocks large datasets
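
Putting the pieces together, here is a minimal sketch that combines the `min_price` filter seen in the DevTools example with the pagination loop. The endpoint URL and parameter names are assumptions for illustration; swap in whatever you observed in the Network tab.

```python
import time

import requests

BASE_URL = "https://example.com/search"  # assumed endpoint for illustration

# Filters mirror the DevTools example: ?q=laptop&min_price=500
base_params = {"q": "laptop", "min_price": 500}

for page in range(1, 4):
    # Merge the shared filters with the current page number
    params = {**base_params, "page": page}
    res = requests.get(BASE_URL, params=params, timeout=10)
    res.raise_for_status()  # stop early if the site rejects the request
    print("Scraped:", res.url)
    time.sleep(1)  # small delay between requests to stay polite
```

Because `requests` URL-encodes the parameter values for you, you can change filters freely without hand-building query strings.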

#Python #Beginner #Forms