Official Python SDK for the lobstr.io API. Typed models, sync & async clients, auto-pagination, and full API coverage — one dependency (httpx).
## Installation

```shell
pip install lobstrio-sdk
```

Requires Python 3.10+. Only dependency: `httpx`.
## Features

Everything you need to integrate lobstr.io into your Python applications:

- **Sync & async**: both `LobstrClient` and `AsyncLobstrClient`, with identical API surfaces.
- **Typed models**: dataclass models for every response; no raw dicts in the public API.
- **Auto-pagination**: a lazy `PageIterator` streams all pages on demand via its `.iter()` method.
- **Token resolution**: the token is resolved from an explicit parameter, the `LOBSTR_TOKEN` environment variable, or `~/.config/lobstr/config.toml`, in that order.
## Quickstart

Full workflow: create a squid, add tasks, run, and get results.

```python
from lobstrio import LobstrClient

client = LobstrClient()  # auto-resolves token

# Get account info
user = client.me()
print(f"{user.email} — {client.balance().credits} credits")

# Create a squid and scrape
squid = client.squids.create(
    crawler="google-maps-leads-scraper",
    name="Restaurants Paris",
)
client.squids.update(squid.id, params={"country": "France", "max_results": 50})
client.tasks.add(squid=squid.id, tasks=[
    {"url": "https://google.com/maps/search/restaurants+paris"},
])

# Start and wait for completion
run = client.runs.start(squid=squid.id)
run = client.runs.wait(run.id)

# Get results
for page in client.results.iter(run=run.id):
    for place in page.data:
        print(f"{place['name']} — {place['score']}★")
```

### Async

```python
import asyncio

from lobstrio import AsyncLobstrClient

async def main():
    async with AsyncLobstrClient() as client:
        squids = await client.squids.list()
        for squid in squids.data:
            print(f"{squid.name} ({squid.id})")

asyncio.run(main())
```

## Endpoint coverage

Every SDK method, mapped to its API endpoint.
### User

| SDK Method | API Endpoint |
|---|---|
| client.me() | GET /v1/me |
| client.balance() | GET /v1/user/balance |
### Crawlers

| SDK Method | API Endpoint |
|---|---|
| client.crawlers.list() | GET /v1/crawlers |
| client.crawlers.get(hash) | GET /v1/crawlers/{hash} |
| client.crawlers.params(hash) | GET /v1/crawlers/{hash}/params |
| client.crawlers.attributes(hash) | GET /v1/crawlers/{hash}/attributes |
### Squids

| SDK Method | API Endpoint |
|---|---|
| client.squids.create(crawler, name) | POST /v1/squids |
| client.squids.list() | GET /v1/squids |
| client.squids.get(hash) | GET /v1/squids/{hash} |
| client.squids.update(hash, ...) | POST /v1/squids/{hash} |
| client.squids.empty(hash) | POST /v1/squids/{hash}/empty |
| client.squids.delete(hash) | DELETE /v1/squids/{hash} |
### Tasks

| SDK Method | API Endpoint |
|---|---|
| client.tasks.add(squid, tasks) | POST /v1/tasks |
| client.tasks.list(squid) | GET /v1/tasks |
| client.tasks.get(hash) | GET /v1/tasks/{hash} |
| client.tasks.upload(squid, file) | POST /v1/tasks/upload |
| client.tasks.upload_status(id) | GET /v1/tasks/upload/{id} |
| client.tasks.delete(hash) | DELETE /v1/tasks/{hash} |
### Runs

| SDK Method | API Endpoint |
|---|---|
| client.runs.start(squid) | POST /v1/runs |
| client.runs.list(squid) | GET /v1/runs |
| client.runs.get(hash) | GET /v1/runs/{hash} |
| client.runs.stats(hash) | GET /v1/runs/{hash}/stats |
| client.runs.tasks(hash) | GET /v1/runtasks |
| client.runs.abort(hash) | POST /v1/runs/{hash}/abort |
| client.runs.download(hash) | GET /v1/runs/{hash}/download |
| client.runs.wait(hash) | Polls GET /v1/runs/{hash} |
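`runs.wait(hash)` is the one method above that maps to polling rather than a single request. A minimal sketch of that behavior, assuming a pluggable `fetch_run` callable (standing in for `GET /v1/runs/{hash}`) and illustrative terminal status names:

```python
import time

# Assumed terminal statuses; the real API's status values may differ.
TERMINAL_STATUSES = {"done", "failed", "aborted"}

def wait_for_run(fetch_run, run_hash: str,
                 interval: float = 2.0, timeout: float = 600.0) -> dict:
    """Poll fetch_run(run_hash) until the run reaches a terminal status."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        run = fetch_run(run_hash)  # stands in for GET /v1/runs/{hash}
        if run["status"] in TERMINAL_STATUSES:
            return run
        time.sleep(interval)
    raise TimeoutError(f"run {run_hash} did not finish within {timeout}s")
```

A fixed interval with a hard deadline keeps the sketch simple; a production client might add exponential backoff.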
### Results

| SDK Method | API Endpoint |
|---|---|
| client.results.list(run) | GET /v1/results |
| client.results.iter(run) | GET /v1/results (paginated) |
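The lazy pagination behind `results.iter(run)` can be sketched as a generator that fetches one page per request and stops when the API reports no further pages. `fetch_page` and the `"next"`/`"data"` keys are assumptions for illustration, not the SDK's actual response shape:

```python
def iter_pages(fetch_page, run_hash: str):
    """Yield result pages one at a time, fetching each only when consumed."""
    page_number = 1
    while True:
        page = fetch_page(run_hash, page=page_number)  # GET /v1/results?page=N
        yield page
        if not page.get("next"):  # no next page: stop without another request
            return
        page_number += 1
```

Because it is a generator, breaking out of the loop early means later pages are never requested.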
### Accounts

| SDK Method | API Endpoint |
|---|---|
| client.accounts.list() | GET /v1/accounts |
| client.accounts.get(hash) | GET /v1/accounts/{hash} |
| client.accounts.types() | GET /v1/accounts/types |
| client.accounts.sync(type, cookies) | POST /v1/synchronize |
| client.accounts.delete(hash) | DELETE /v1/accounts/{hash} |
### Delivery

| SDK Method | API Endpoint |
|---|---|
| client.delivery.email(squid, ...) | POST /v1/delivery |
| client.delivery.google_sheet(squid, ...) | POST /v1/delivery |
| client.delivery.s3(squid, ...) | POST /v1/delivery |
| client.delivery.webhook(squid, ...) | POST /v1/delivery |
| client.delivery.sftp(squid, ...) | POST /v1/delivery |