Documentation Index
Fetch the complete documentation index at: https://docs.tesslate.com/llms.txt
Use this file to discover all available pages before exploring further.

What is OpenSail?
OpenSail is an open platform for building, running, and sharing AI-powered software. Describe a job you want done, drop in a file, and OpenSail helps turn it into a working agent, app, or workflow. Agents write code, use connected tools, remember what they learn, and continue work across many steps inside sandboxed cloud environments. You can use OpenSail on the cloud at tesslate.com, install it as a native desktop app, or self-host the whole stack on your own Kubernetes cluster.
Any model
Anthropic, OpenAI, Google, Meta, Mistral, Qwen, DeepSeek, xAI, local Ollama, and more. Bring your own key or use ours.
Snapshot filesystem
Fork a running workspace in seconds. Every project sits on a btrfs subvolume with instant snapshot, clone, and restore.
Your infrastructure
Desktop, Docker, or Kubernetes. On-prem, air-gapped, or any cloud. No per-seat pricing, no lock-in.
Publish as apps
Turn any workspace into a one-click installable app. Share privately, with a team, or to the public marketplace.
What you can build
OpenSail treats every shipped unit as an “app.” An app can be a UI, a chat interface, a scheduled cron job, a webhook handler, or an MCP tool callable by other agents. All of these are declared in the same manifest and built in the same workspace.
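The actual manifest schema is not shown on this page. As a purely hypothetical illustration of the idea that several app surfaces live in one manifest, it might look something like the following (every file name, key, and value below is invented for illustration, not OpenSail's real format):

```yaml
# Hypothetical manifest -- all keys and values are invented for
# illustration; consult the real schema before relying on any of this.
apps:
  - name: weekly-report
    kind: cron              # scheduled automation
    schedule: "0 9 * * 5"   # every Friday at 09:00
  - name: support-bot
    kind: chat              # chat/messaging surface
    channels: [slack, telegram]
  - name: lead-router
    kind: webhook           # webhook handler
    path: /hooks/leads
  - name: code-reviewer
    kind: mcp-tool          # callable by external coding agents
```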
Agents that actually work over time
Agents run in sandboxed containers, use connected tools (Slack, GitHub, databases, MCP servers), remember context across sessions, and continue work even when you close your laptop. Progressive persistence writes every step to the database as it happens, so you can resume mid-task.
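The progressive-persistence idea can be sketched in a few lines. This is illustrative only, assuming nothing about OpenSail's actual schema or API: each step is committed before the next one starts, so a restart resumes from the last committed row.

```python
import sqlite3

# Illustrative only: the table name, columns, and step format are invented.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE steps (task_id TEXT, seq INTEGER, output TEXT)")

def run_step(task_id: str, seq: int, output: str) -> None:
    # Persist the step result immediately, so a crash or closed laptop
    # means resuming from the last committed row instead of from zero.
    db.execute("INSERT INTO steps VALUES (?, ?, ?)", (task_id, seq, output))
    db.commit()

def resume_point(task_id: str) -> int:
    # Next step index: one past the highest persisted sequence number.
    row = db.execute(
        "SELECT COALESCE(MAX(seq), -1) FROM steps WHERE task_id = ?",
        (task_id,),
    ).fetchone()
    return row[0] + 1

run_step("t1", 0, "planned work")
run_step("t1", 1, "wrote code")
print(resume_point("t1"))  # resumes at step 2
```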
Full-stack applications
Multi-container projects with frontend, backend, database, and workers. React, Next.js, Node, Python, Go, any framework. Each project gets its own isolated namespace and volume.
Scheduled automations
Agents that pull metrics every Friday, draft weekly reports, monitor support channels, route leads to CRM, or process files in a queue. Natural language schedules convert to cron expressions.
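How OpenSail actually parses schedules is not documented here. As a toy sketch of the natural-language-to-cron idea (a lookup over a few fixed phrases, nothing like a real language parser):

```python
# Toy phrase-to-cron mapping, illustration only -- a real converter
# parses arbitrary language rather than matching fixed strings.
PHRASE_TO_CRON = {
    "every friday": "0 9 * * 5",          # Fridays at 09:00
    "every hour": "0 * * * *",
    "every day at midnight": "0 0 * * *",
}

def to_cron(phrase: str) -> str:
    try:
        return PHRASE_TO_CRON[phrase.strip().lower()]
    except KeyError:
        raise ValueError(f"no cron mapping for: {phrase!r}")

print(to_cron("Every Friday"))  # 0 9 * * 5
```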
Chat and messaging bots
Deploy agents to Slack, Telegram, Discord, WhatsApp, or Signal. Agents pick up requests, run work in sandboxed environments, and reply on the platform.
MCP tools for other agents
Publish an agent as an MCP tool server so external coding agents (Claude Code, Cursor, Codex) can call it, get sandboxed compute, and use it as infrastructure.
How it works
Describe the job
Tell OpenSail what you want to build, or drop in a file. The agent plans the work, wires up the right tools, and asks clarifying questions.
Work in a real IDE
Monaco editor, a real terminal, a file tree, live preview with hot reload, git with branches and blame, a kanban board. The agent sees what you see.
Run in sandboxes
Every project runs in a three-tier compute model on Kubernetes. File ops and reasoning are near-free. Shell runs in warm pools. Full dev environments wake on demand and hibernate when idle.
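The exact tier boundaries are not specified beyond the description above. A hypothetical dispatcher choosing among the three tiers (the tier names and task-to-tier rules below are invented for illustration) might look like:

```python
# Hypothetical routing among the three compute tiers described above.
# Tier names and the task->tier rules are invented for illustration.
TIER_RULES = {
    "read_file": "in-process",    # file ops and reasoning: near-free
    "reasoning": "in-process",
    "shell": "warm-pool",         # shell commands: pre-warmed containers
    "dev_server": "full-sandbox", # full environments: wake on demand
}

def pick_tier(task_kind: str) -> str:
    # Default unknown task kinds to the heaviest tier: over-provisioning
    # is safer than running an unclassified task outside a full sandbox.
    return TIER_RULES.get(task_kind, "full-sandbox")

print(pick_tier("shell"))  # warm-pool
```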
Technology
| Layer | What |
|---|---|
| Frontend | React 19, TypeScript, Vite, Tailwind, Monaco Editor, XYFlow |
| Desktop shell | Tauri v2 (Rust), local SQLite, Stronghold token store |
| Backend | FastAPI, Python 3.11, SQLAlchemy, LiteLLM |
| Agent runner | tesslate-agent (Python), 33 built-in tools across 8 categories |
| Database | PostgreSQL (cloud), SQLite (desktop) |
| Task queue | Redis + ARQ (cloud), asyncio + apscheduler (desktop) |
| Storage | btrfs CSI driver, Volume Hub orchestrator, S3-backed CAS |
| Containers | Docker Compose (dev), Kubernetes (prod), subprocesses (desktop) |
| Routing | Traefik (Docker), NGINX Ingress (Kubernetes) |
Three ways to run
Cloud
Managed at tesslate.com. Sign up and build.
Desktop
Native Tauri app for macOS, Windows, Linux.
Self-hosted
Docker or Kubernetes on your own hardware.
Why open source
Workspace agents are powerful. They touch your data, your tools, your processes. You should be able to see exactly what they are doing, run them on your own infrastructure, and not be locked to a single model provider. OpenSail runs on any model, deploys on any cloud, and keeps your data where you put it.
Next steps
Quickstart
Build your first agent in five minutes.
Install on Docker
Clone, configure, boot. Twenty minutes from zero to running.
Install on Desktop
macOS, Windows, Linux. No cluster needed.
Publish an app
Turn a workspace into a one-click installable app.