Command Line Interface

lobstr CLI

Run scrapers, manage squids, download results — all from your terminal. One command to go from zero to CSV.

Installation

Terminal
pip install lobstrio

Requires Python 3.10+. Installs the lobstr command.

Features

Full control over lobstr.io from your terminal.

One-Command Scrape

The go command chains the full workflow — creating a squid, configuring it, adding tasks, starting a run, and downloading the results — into a single command.

50+ Crawlers

Browse, search, and use any crawler by slug — Google Maps, LinkedIn, Twitter, and more.

CSV / JSON Export

Download results as CSV or JSON. Live progress bar while waiting for runs to complete.

Delivery Config

Set up email, Google Sheets, S3, SFTP, or webhook delivery from the terminal.

Quick Start

The go command does everything in one shot.

Terminal
# Scrape Google Maps in one command
lobstr go google-maps-leads-scraper \
  "https://google.com/maps/search/restaurants+paris" \
  -o leads.csv

# Keyword-based search
lobstr go google-search-scraper "pizza delivery" --key keyword

# With parameters and concurrency
lobstr go google-maps-leads-scraper url1 url2 \
  --param max_results=200 --concurrency 3

# Don't download, just kick off
lobstr go twitter-profile-scraper @elonmusk --no-download

Step-by-Step Workflow

Terminal
# Step by step
lobstr crawlers search "google maps"
lobstr squid create google-maps-leads-scraper --name "Paris Restaurants"
lobstr task add SQUID_ID "https://google.com/maps/search/restaurants+paris"
lobstr run start SQUID_ID --wait
lobstr results get SQUID_ID --format csv -o results.csv

# Check who you are
lobstr whoami
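The step-by-step flow is easy to script. A minimal sketch, assuming you paste the squid ID printed by `lobstr squid create` into a shell variable (the ID value below is a placeholder, and the CLI's exact output format may differ):

```shell
#!/usr/bin/env sh
# Paste the ID printed by `lobstr squid create` here (placeholder value).
SQUID_ID="s-1234abcd"

# Queue a URL, then run the squid and block until it finishes.
lobstr task add "$SQUID_ID" "https://google.com/maps/search/restaurants+paris"
lobstr run start "$SQUID_ID" --wait

# Export the collected rows as CSV.
lobstr results get "$SQUID_ID" --format csv -o results.csv
```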

Command Reference

All available commands grouped by resource.

Crawlers
lobstr crawlers ls
lobstr crawlers search <query>
lobstr crawlers show <slug>
lobstr crawlers params <slug>
lobstr crawlers attrs <slug>
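Before creating a squid, you can use these commands to find a crawler and inspect its inputs and outputs. A short example, reusing the Google Maps slug from the Quick Start:

```shell
# Find the crawler, then list the parameters it accepts
# and the attributes it returns.
lobstr crawlers search "google maps"
lobstr crawlers params google-maps-leads-scraper
lobstr crawlers attrs google-maps-leads-scraper
```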

Squids
lobstr squid create <crawler> --name "My Scraper"
lobstr squid ls
lobstr squid show <id>
lobstr squid update <id> --param key=value
lobstr squid empty <id>
lobstr squid rm <id>

Tasks
lobstr task add <squid> <url1> <url2> ...
lobstr task upload <squid> <file.csv>
lobstr task ls <squid>
lobstr task rm <id>
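For bulk input, `task upload` takes a CSV file. A sketch, assuming the file holds one URL per row — check `lobstr crawlers show <slug>` for the exact input format your crawler expects:

```shell
# Build a two-row input file, then queue every row as a task.
printf '%s\n' \
  "https://google.com/maps/search/restaurants+paris" \
  "https://google.com/maps/search/cafes+lyon" > tasks.csv

lobstr task upload SQUID_ID tasks.csv
```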

Runs
lobstr run start <squid> --wait
lobstr run ls <squid>
lobstr run show <id>
lobstr run stats <id>
lobstr run abort <id>
lobstr run download <id> -o results.csv

Results
lobstr results get <squid> --format csv -o data.csv

Delivery
lobstr delivery email <squid> --email user@example.com
lobstr delivery webhook <squid> --url https://...
lobstr delivery s3 <squid> --bucket my-bucket
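Delivery hooks fire when a run completes, so they pair naturally with a fire-and-forget run. A sketch using only the commands above (the bucket name and SQUID_ID are placeholders):

```shell
# Push results to S3 and send an email notification when the run finishes,
# then launch the run without waiting on it.
lobstr delivery s3 SQUID_ID --bucket my-bucket
lobstr delivery email SQUID_ID --email user@example.com
lobstr run start SQUID_ID
```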