
Configuration File

All configuration is done through the .env file in the root directory. Start by copying the example:
cp .env.example .env
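If you want a quick sanity check that the copy worked, you can list the two required values (assuming they appear in .env.example; they may still be empty at this point):
grep -E "^(SECRET_KEY|LITELLM_MASTER_KEY)=" .env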

Required Variables

These variables must be set for Tesslate Studio to function.

SECRET_KEY
string
required
Secret key for JWT token signing and session encryption.
Generate:
python -c "import secrets; print(secrets.token_urlsafe(32))"
Example:
SECRET_KEY=your-generated-secret-key-here

LITELLM_MASTER_KEY
string
required
Master key for LiteLLM proxy authentication.
Generate:
python -c "import secrets; print('sk-' + secrets.token_urlsafe(32))"
Example:
LITELLM_MASTER_KEY=sk-your-litellm-master-key
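If you prefer a one-shot setup, here is a rough sketch that generates both required keys and appends them to .env (assumes a bash-compatible shell and that neither key is already set in the file):
# append generated secrets to .env (remove any existing entries first)
echo "SECRET_KEY=$(python -c 'import secrets; print(secrets.token_urlsafe(32))')" >> .env
echo "LITELLM_MASTER_KEY=$(python -c "import secrets; print('sk-' + secrets.token_urlsafe(32))")" >> .env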

AI Provider API Keys

At least one AI provider API key is required for Tesslate Studio to generate code. Supported providers:
  • OpenAI
  • Anthropic
  • Google
  • Azure OpenAI
For example, with OpenAI:
OPENAI_API_KEY=sk-your-openai-key
Supports GPT-5, GPT-4, and GPT-3.5 models.
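The exact variable names for the remaining providers are not shown here. If Tesslate Studio passes provider keys straight through to LiteLLM, the conventional LiteLLM names would look like the following; treat these as assumptions and confirm them against .env.example:
# hypothetical examples - verify the exact names in .env.example
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
GEMINI_API_KEY=your-google-ai-key
AZURE_API_KEY=your-azure-openai-key
AZURE_API_BASE=https://your-resource.openai.azure.com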

Optional Variables

Application Settings

APP_DOMAIN
string
default:"studio.localhost"
Domain where Tesslate Studio is hosted.
# Local development
APP_DOMAIN=studio.localhost

# Production
APP_DOMAIN=studio.yourcompany.com
APP_PROTOCOL
string
default:"http"
Protocol for accessing the application. Options: http, https
# Local
APP_PROTOCOL=http

# Production with SSL
APP_PROTOCOL=https
FRONTEND_URL
string
default:"http://studio.localhost"
Full URL where the frontend is accessible.
FRONTEND_URL=https://studio.yourcompany.com

Database Configuration

DATABASE_URL
string
PostgreSQL connection string in asyncpg format.
# Docker default (no change needed)
DATABASE_URL=postgresql+asyncpg://tesslate:tesslate_password@postgres:5432/tesslate_db

# External managed database
DATABASE_URL=postgresql+asyncpg://admin:[email protected]:5432/tesslate
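To confirm the database is reachable with your settings, one quick check is to run a trivial query inside the database container (sketch assumes the Docker default service name postgres and the default credentials below):
docker compose exec postgres psql -U tesslate -d tesslate_db -c "SELECT 1;"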
POSTGRES_USER
string
default:"tesslate"
PostgreSQL username (Docker only).
POSTGRES_PASSWORD
string
default:"tesslate_password"
PostgreSQL password (Docker only).
Change the default password in production environments!
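One way to pick a strong replacement is the same approach used for the other secrets:
python -c "import secrets; print(secrets.token_urlsafe(24))"
If you change the password, remember to update DATABASE_URL as well, since the Docker default connection string embeds it.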
POSTGRES_DB
string
default:"tesslate_db"
PostgreSQL database name (Docker only).

LiteLLM Configuration

LITELLM_DEFAULT_MODELS
string
default:"gpt-5o-mini,claude-3-haiku,gemini-pro"
Comma-separated list of default AI models.
# Use only GPT-5 models
LITELLM_DEFAULT_MODELS=gpt-5o,gpt-5o-mini

# Mix of providers
LITELLM_DEFAULT_MODELS=gpt-5o-mini,claude-3-haiku,gemini-flash
LITELLM_INITIAL_BUDGET
number
default:"10.0"
Initial API budget per user in USD.
# $50 initial budget
LITELLM_INITIAL_BUDGET=50.0
LITELLM_PROXY_URL
string
default:"http://litellm:4000"
LiteLLM proxy endpoint URL.
# External LiteLLM instance
LITELLM_PROXY_URL=http://litellm.yourcompany.com:4000
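To verify the proxy is reachable, recent LiteLLM versions expose an unauthenticated liveness endpoint; a sketch against an external instance (adjust the host to match your LITELLM_PROXY_URL):
curl http://litellm.yourcompany.com:4000/health/liveliness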

Container Runtime

CONTAINER_MODE
string
default:"docker"
Container orchestration system to use. Options: docker
DOCKER_SOCKET_PATH
string
Path to Docker socket for container management. Default: /var/run/docker.sock (Linux/Mac), //./pipe/docker_engine (Windows)
# Linux/Mac
DOCKER_SOCKET_PATH=/var/run/docker.sock

# Windows
DOCKER_SOCKET_PATH=//./pipe/docker_engine
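To confirm the socket is present and accessible, you can query the Docker Engine API over it directly (Linux/Mac sketch; requires a curl build with Unix socket support):
curl --unix-socket /var/run/docker.sock http://localhost/version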

Auto-Seeding

AUTO_SEED_DATABASE
boolean
default:"true"
Automatically seed database with agents and templates on startup.
# Enable auto-seeding (recommended)
AUTO_SEED_DATABASE=true

# Disable for manual control
AUTO_SEED_DATABASE=false
What gets seeded:
  • 4 marketplace agents (Stream Builder, Full Stack Agent, etc.)
  • 3 project templates (Next.js, Vite+React+FastAPI, Vite+React+Go)
  • 6 open-source customizable agents

GitHub Integration

GITHUB_CLIENT_ID
string
GitHub OAuth app client ID. Required for GitHub integration.
GITHUB_CLIENT_SECRET
string
GitHub OAuth app client secret.
GITHUB_CLIENT_ID=your-github-client-id
GITHUB_CLIENT_SECRET=your-github-client-secret

Logging

LOG_LEVEL
string
default:"INFO"
Application logging level. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
# Development
LOG_LEVEL=DEBUG

# Production
LOG_LEVEL=INFO
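To see the effect of the chosen level on a running deployment, tail the logs of the relevant service; for example, using the orchestrator service referenced elsewhere on this page:
docker compose logs -f orchestrator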

Production Configuration

Recommended settings for production deployments:
# Security
SECRET_KEY=<strong-random-key>
LITELLM_MASTER_KEY=sk-<strong-random-key>

# Domain
APP_DOMAIN=studio.yourcompany.com
APP_PROTOCOL=https
FRONTEND_URL=https://studio.yourcompany.com

# Database (use managed service)
DATABASE_URL=postgresql+asyncpg://user:[email protected]:5432/tesslate

# Logging
LOG_LEVEL=INFO

# Auto-seed (optional in production)
AUTO_SEED_DATABASE=false

Local AI Models (Ollama)

To use local AI models with Ollama:
  1. Install Ollama: download it from ollama.ai/download
  2. Pull a model:
ollama pull llama2
  3. Configure LiteLLM to use it:
LITELLM_DEFAULT_MODELS=ollama/llama2
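Note that when Tesslate Studio runs in Docker while Ollama runs on the host, the LiteLLM container usually cannot reach Ollama via localhost; it typically needs host.docker.internal (or the host's IP) instead. You can confirm Ollama itself is serving by listing its local models:
curl http://localhost:11434/api/tags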

Configuration Examples

A minimal local setup with no paid provider keys, using local models via Ollama:
SECRET_KEY=<generated>
LITELLM_MASTER_KEY=sk-<generated>

# No paid API keys - use local models
LITELLM_DEFAULT_MODELS=ollama/llama2

Troubleshooting

Problem: SECRET_KEY is missing or invalid. Solution: Generate a new key and update .env:
python -c "import secrets; print(secrets.token_urlsafe(32))"
Problem: PostgreSQL is not accessible. Solution:
# Check if postgres container is running
docker compose ps postgres

# Restart database
docker compose restart postgres

# Check logs
docker compose logs postgres
Problem: Missing or invalid AI provider API key. Solution:
  1. Verify the key in .env has no extra spaces
  2. Test the key with the provider directly (see the example after this list)
  3. Restart the orchestrator:
docker compose restart orchestrator
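For step 2, you can test a key against the provider directly; for example, with OpenAI (assumes the key is exported in your shell):
curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"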

Next Steps