Before setting up your development environment, ensure you have the following installed:
| Requirement | Minimum Version | Check Command | Notes |
|---|---|---|---|
| Python | 3.12+ | `python3 --version` | Required for running the application natively |
| uv | Latest | `uv --version` | Python package manager (replaces pip) |
| PostgreSQL | 12+ | `psql --version` | Required for integration tests and local development |
| Docker | 20.10+ | `docker --version` | Required for containerized development and E2E tests |
| Docker Compose | 2.0+ | `docker compose version` | Multi-container orchestration |
| git | Any recent | `git --version` | Source control |
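As a quick sanity check, the commands in the table above can be probed with a short stdlib script (this only checks that the tools are on `PATH`; it does not parse versions):

```python
import shutil

# Check commands from the prerequisites table; Docker Compose v2 ships as a
# `docker` subcommand, so checking for `docker` covers both rows.
REQUIRED_TOOLS = ["python3", "uv", "psql", "docker", "git"]

def missing_tools(tools=REQUIRED_TOOLS):
    """Return the required commands that are not found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]

if __name__ == "__main__":
    missing = missing_tools()
    print("All tools found" if not missing else f"Missing: {', '.join(missing)}")
```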
The project is managed with **uv**. Install it with `curl -LsSf https://astral.sh/uv/install.sh | sh`, or `brew install uv` on macOS.
Clone the repository:

```bash
git clone https://github.com/prebid/salesagent.git
cd salesagent
```
Use uv to install all project dependencies, including development tools:
```bash
uv sync
```
This reads `pyproject.toml` and installs all runtime and development dependencies into a virtual environment managed by uv.

Whenever dependencies change in `pyproject.toml`, run `uv sync` again to update the virtual environment. Never use `pip install` directly.
If you prefer running PostgreSQL directly on your host (instead of Docker), create the development database:
```bash
# Start PostgreSQL (macOS with Homebrew)
brew services start postgresql@16

# Create the database
createdb salesagent_dev

# Optionally create a test database for integration tests
createdb salesagent_test
```
Apply all database schema migrations using Alembic:
```bash
uv run alembic upgrade head
```
This runs 156 migration files that build the complete schema, including tables for tenants, principals, products, media buys, creatives, workflow steps, audit logs, and more.
When running under Docker Compose, migrations are applied automatically on startup (by the db-init service). You only need to run them manually when developing without Docker or after pulling new migration files.
Create a .env file in the project root for local development. The application reads environment variables from this file on startup:
```bash
# Database
DATABASE_URL=postgresql://localhost:5432/salesagent_dev

# Auth (test mode for local development)
ADCP_AUTH_TEST_MODE=true

# Super admin access for the Admin UI
SUPER_ADMIN_EMAILS=dev@example.com

# Optional: AI features (provide your own API key)
# AI_PROVIDER=google
# AI_MODEL=gemini-2.0-flash
# AI_API_KEY=your-api-key-here

# Optional: Skip migrations on startup (useful when running outside Docker)
# SKIP_MIGRATIONS=true

# Optional: Multi-tenant mode
# ADCP_MULTI_TENANT=false
```
The `.env` file is listed in `.gitignore` and should never be committed. Each developer maintains their own local configuration.
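For illustration, here is one way the variables above could be read at startup. This is a sketch using `os.getenv`; the application may actually load them via python-dotenv or a settings library, and the defaults shown are assumptions:

```python
import os

def load_settings() -> dict:
    """Parse the environment variables from the sample .env above."""
    return {
        "database_url": os.getenv(
            "DATABASE_URL", "postgresql://localhost:5432/salesagent_dev"
        ),
        # Boolean flags arrive as strings; treat anything but "true" as False.
        "auth_test_mode": os.getenv("ADCP_AUTH_TEST_MODE", "false").lower() == "true",
        "skip_migrations": os.getenv("SKIP_MIGRATIONS", "false").lower() == "true",
        # SUPER_ADMIN_EMAILS is a comma-separated list.
        "super_admin_emails": [
            email.strip()
            for email in os.getenv("SUPER_ADMIN_EMAILS", "").split(",")
            if email.strip()
        ],
    }
```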
Run all services directly using the deployment script:
```bash
uv run python scripts/deploy/run_all_services.py
```
This starts the unified FastAPI application on port 8080 with all sub-applications mounted:
- `/mcp/` – FastMCP Server (StreamableHTTP transport)
- `/a2a` – A2A Server (JSON-RPC 2.0)
- `/admin` – Flask Admin UI
- `/api/v1` – REST API
- `/` – Tenant Landing Pages

For a fully containerized environment that includes nginx and PostgreSQL:
```bash
docker compose up -d
```
This starts all services behind an nginx reverse proxy on port 8000. Hot-reload is supported via volume mounts – code changes in src/ are reflected without rebuilding the container.
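Conceptually, the unified application routes each request to the sub-application with the longest matching path prefix. A stdlib-only sketch of that dispatch (the real code mounts ASGI/WSGI apps; the string values here are placeholders):

```python
# Mount table mirroring the route list above; values are placeholder names.
MOUNTS = {
    "/mcp/": "mcp",
    "/a2a": "a2a",
    "/admin": "admin",
    "/api/v1": "rest",
    "/": "landing",  # tenant landing pages catch-all
}

def resolve_mount(path: str, mounts: dict = MOUNTS) -> str:
    """Pick the sub-app whose prefix is the longest match for the path."""
    for prefix in sorted(mounts, key=len, reverse=True):
        if path.startswith(prefix):
            return mounts[prefix]
    raise LookupError(path)
```

Because `/` is in the table, any path that matches nothing else falls through to the tenant landing pages.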
| Mode | Command | URL | Use Case |
|---|---|---|---|
| Native | `uv run python scripts/deploy/run_all_services.py` | http://localhost:8080 | Fast iteration, debugger support |
| Docker | `docker compose up -d` | http://localhost:8000 | Full stack with nginx, PostgreSQL |
```bash
# Start all containers in the background
docker compose up -d

# Watch logs in real time
docker compose logs -f
```
The Docker Compose configuration mounts the src/ directory as a volume, so code changes are picked up automatically by the uvicorn server running inside the container. You do not need to rebuild the container for Python code changes.
To rebuild after changes to Dockerfile, pyproject.toml, or system dependencies:
```bash
docker compose up -d --build
```
| Action | Command |
|---|---|
| Start all services | docker compose up -d |
| Stop all services | docker compose down |
| Stop and remove volumes (clean database) | docker compose down -v |
| Rebuild containers | docker compose up -d --build |
| View all logs | docker compose logs -f |
| View app logs only | docker compose logs -f salesagent |
| Open a shell in the container | docker compose exec salesagent bash |
| Run migrations manually | docker compose exec salesagent alembic upgrade head |
| Check container status | docker compose ps |
Create or update .vscode/settings.json in the project root:
```json
{
  "python.defaultInterpreterPath": ".venv/bin/python",
  "python.analysis.typeCheckingMode": "basic",
  "python.analysis.autoImportCompletions": true,
  "[python]": {
    "editor.defaultFormatter": "charliermarsh.ruff",
    "editor.formatOnSave": true,
    "editor.codeActionsOnSave": {
      "source.fixAll.ruff": "explicit",
      "source.organizeImports.ruff": "explicit"
    }
  },
  "python.testing.pytestEnabled": true,
  "python.testing.pytestArgs": ["tests/unit"]
}
```
| Extension | Purpose |
|---|---|
| Ruff | Linting and formatting (replaces Black, Flake8, isort) |
| Pylance | Type checking and IntelliSense |
| Python | Python language support |
| Docker | Docker and Compose file support |
| SQLAlchemy Stubs | SQLAlchemy type hints |
These settings point VS Code at the uv-managed virtual environment at `.venv/bin/python`.

| Task | Command |
|---|---|
| Run the server (native) | uv run python scripts/deploy/run_all_services.py |
| Run the server (Docker) | docker compose up -d |
| Run all unit tests | uv run pytest tests/unit |
| Run integration tests | uv run pytest tests/integration |
| Run E2E tests | uv run pytest tests/e2e |
| Run a specific test file | uv run pytest tests/unit/test_products.py |
| Run tests with coverage | uv run pytest tests/unit --cov=src --cov-report=html |
| Create a new migration | uv run alembic revision --autogenerate -m "description" |
| Apply all migrations | uv run alembic upgrade head |
| Rollback one migration | uv run alembic downgrade -1 |
| Lint code | uv run ruff check src/ tests/ |
| Auto-fix lint issues | uv run ruff check --fix src/ tests/ |
| Format code | uv run ruff format src/ tests/ |
| Type check | uv run mypy src/ |
| Run pre-commit hooks | uv run pre-commit run --all-files |
| Install pre-commit hooks | uv run pre-commit install |
Understanding these core patterns will help you navigate and contribute to the codebase effectively.
**Transport-Agnostic Tools (`_impl` Functions).** All business logic lives in transport-agnostic `_impl` functions. Each tool has three layers:
```python
# Layer 1: MCP wrapper (src/core/tools/)
@mcp.tool()
def get_products(brief: str = "", ctx: Context = None) -> ToolResult:
    identity = resolve_identity_from_context(ctx)
    return get_products_raw(brief=brief, identity=identity)

# Layer 2: Raw function (callable from any transport)
def get_products_raw(brief: str = "", identity: ResolvedIdentity = None) -> GetProductsResponse:
    return _get_products_impl(req=GetProductsRequest(brief=brief), identity=identity)

# Layer 3: Implementation (pure business logic)
def _get_products_impl(req: GetProductsRequest, identity: ResolvedIdentity) -> GetProductsResponse:
    # Database queries, adapter calls, AI ranking — no transport awareness
    ...
```
Rules enforced by structural test guards:
- `_impl` functions must never import transport-specific types
- `_impl` functions must take `ResolvedIdentity`, not `Context` or `ToolContext`
- `_impl` functions must raise `AdCPError`, never `ToolError`

Database access uses a context manager that ensures proper transaction handling:
```python
from src.core.database.database_session import get_db_session

with get_db_session() as session:
    product = session.query(Product).filter_by(
        tenant_id=identity.tenant_id,
        product_id=product_id,
    ).first()
    session.commit()  # Explicit commit required
```
All queries are tenant-scoped via composite primary keys (tenant_id, entity_id) to enforce multi-tenant isolation.
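The structural guards mentioned above can be enforced with a small AST-based test. Here is a self-contained sketch; the real guard would scan the actual `_impl` modules on disk, and the forbidden module names used here are assumptions:

```python
import ast

# Hypothetical transport-specific modules that _impl code must not import.
FORBIDDEN = {"fastmcp", "mcp"}

def forbidden_imports(source: str) -> set[str]:
    """Return any forbidden top-level modules imported by the given source."""
    found = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            found |= {alias.name.split(".")[0] for alias in node.names} & FORBIDDEN
        elif isinstance(node, ast.ImportFrom) and node.module:
            if node.module.split(".")[0] in FORBIDDEN:
                found.add(node.module.split(".")[0])
    return found
```

A test guard can then iterate over every `_impl` module and assert `forbidden_imports(...)` is empty.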
The codebase uses a hierarchical error system with recovery classification:
```python
from src.core.exceptions import AdCPValidationError, AdCPNotFoundError

# Raise with recovery hint for the calling agent
raise AdCPValidationError(
    message="Budget exceeds tenant maximum",
    recovery="correctable",  # "terminal", "correctable", or "transient"
    details={"max_budget": 50000, "requested": 75000},
)
```
Recovery classifications:
- `terminal` — Cannot be retried (e.g., authentication failure, authorization denied)
- `correctable` — Agent should modify the request and retry (e.g., validation error, policy violation)
- `transient` — Temporary failure, safe to retry with backoff (e.g., ad server timeout, rate limit)

Tools delegate ad server operations through the adapter interface:
```python
adapter = get_adapter_for_tenant(identity.tenant_id)  # Returns GAM, Kevel, Mock, etc.
response = adapter.create_media_buy(request, packages, start_time, end_time)
```
The adapter is resolved per-tenant from the adapter_config table. See Building a Custom Adapter for the full interface.
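The recovery classifications pair naturally with a retry wrapper on the calling side. A sketch of that policy; the `AdCPError` stub below only mimics the real class in `src.core.exceptions`:

```python
import time

class AdCPError(Exception):
    """Minimal stand-in for the real exception hierarchy."""
    def __init__(self, message: str, recovery: str = "terminal"):
        super().__init__(message)
        self.recovery = recovery

def call_with_retry(fn, attempts: int = 3, base_delay: float = 1.0):
    """Retry only transient failures, with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except AdCPError as err:
            last_attempt = attempt == attempts - 1
            if err.recovery != "transient" or last_attempt:
                raise  # terminal/correctable errors go straight back to the agent
            time.sleep(base_delay * 2 ** attempt)
```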
AI features use Pydantic AI agents with per-tenant model configuration:
```python
from src.services.ai.factory import AIServiceFactory

# Factory creates model instance from tenant's ai_config
model = AIServiceFactory.create_model(tenant.ai_config)
agent = Agent(model=model, system_prompt="...")
result = await agent.run(prompt)
```
Supported providers: Google Gemini (default), OpenAI, Anthropic Claude, Groq, and AWS Bedrock.
**`uv sync` fails with compilation errors.** Some dependencies (e.g., `psycopg2-binary`, `pillow`) include C extensions. Ensure you have the necessary system libraries:
macOS:
```bash
brew install postgresql libpq openssl
```
Ubuntu/Debian:
```bash
sudo apt-get install libpq-dev python3-dev build-essential
```
If you see connection refused errors when starting the application:
1. Verify PostgreSQL is running: `pg_isready` (native) or `docker compose ps` (Docker).
2. Confirm `DATABASE_URL` in your `.env` matches your PostgreSQL setup.
3. Check that the database exists: `psql -l | grep salesagent_dev`.

**Migration state mismatch.** This occurs when the database schema is ahead of or behind the migration history:
```bash
# Check current migration state
uv run alembic current

# Apply all pending migrations
uv run alembic upgrade head
```
If migrations are irrecoverably broken during development, reset the database:
```bash
dropdb salesagent_dev && createdb salesagent_dev && uv run alembic upgrade head
```
If pre-commit hooks fail:
```bash
# See what failed
uv run pre-commit run --all-files

# Auto-fix formatting issues
uv run ruff format src/ tests/
uv run ruff check --fix src/ tests/
```
If port 8080 (native) or 8000 (Docker) is already in use:
- Native: override the `PORT` environment variable, e.g. `PORT=9090 uv run python scripts/deploy/run_all_services.py`
- Docker: change the host port mapping in `docker-compose.yml`, e.g. `"9000:8000"`
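To confirm which side owns the conflict before restarting anything, a quick stdlib check for whether a port is already taken:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        return sock.connect_ex((host, port)) == 0
```

For example, `port_in_use(8080)` tells you whether the native server's default port is free.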