Documentation Index

Fetch the complete documentation index at: https://docs.tesslate.com/llms.txt

Use this file to discover all available pages before exploring further.

Tesslate OpenSail

What is OpenSail?

OpenSail is an open platform for building, running, and sharing AI-powered software. Describe a job you want done, drop in a file, and OpenSail helps turn it into a working agent, app, or workflow. Agents write code, use connected tools, remember what they learn, and continue work across many steps inside sandboxed cloud environments. You can use OpenSail on the cloud at tesslate.com, install it as a native desktop app, or self-host the whole stack on your own Kubernetes cluster.

Any model

Anthropic, OpenAI, Google, Meta, Mistral, Qwen, DeepSeek, xAI, local Ollama, and more. Bring your own key or use ours.

Snapshot filesystem

Fork a running workspace in seconds. Every project sits on a btrfs subvolume with instant snapshot, clone, and restore.

Your infrastructure

Desktop, Docker, or Kubernetes. On-prem, air-gapped, or any cloud. No per-seat pricing, no lock-in.

Publish as apps

Turn any workspace into a one-click installable app. Share privately, with a team, or to the public marketplace.

What you can build

OpenSail treats every shipped unit as an “app.” An app can be a UI, a chat interface, a scheduled cron job, a webhook handler, or an MCP tool callable by other agents. All of these are declared in the same manifest and built in the same workspace.
Agents run in sandboxed containers, use connected tools (Slack, GitHub, databases, MCP servers), remember context across sessions, and continue work even when you close your laptop. Progressive persistence writes every step to the database as it happens, so you can resume mid-task.
Multi-container projects with frontend, backend, database, and workers. React, Next.js, Node, Python, Go, any framework. Each project gets its own isolated namespace and volume.
Agents that pull metrics every Friday, draft weekly reports, monitor support channels, route leads to CRM, or process files in a queue. Natural language schedules convert to cron expressions.
Deploy agents to Slack, Telegram, Discord, WhatsApp, or Signal. Agents pick up requests, run work in sandboxed environments, and reply on the platform.
Publish an agent as an MCP tool server so external coding agents (Claude Code, Cursor, Codex) can call it, get sandboxed compute, and use it as infrastructure.
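To illustrate the natural-language-to-cron conversion mentioned above, here is a toy sketch. OpenSail's real converter is not shown in this document; the `PATTERNS` table, the `to_cron` name, and the default 9:00 run time are all illustrative assumptions, not its actual API.

```python
# Illustrative only: a toy lookup from a few natural-language schedules
# to cron expressions. OpenSail's real converter handles far more
# phrasings; every name here is hypothetical.

PATTERNS = {
    "every minute": "* * * * *",
    "every hour": "0 * * * *",
    "every day at 9am": "0 9 * * *",
    "every friday": "0 9 * * 5",   # assumes a default 9:00 run time
    "every monday at 8am": "0 8 * * 1",
}

def to_cron(phrase: str) -> str:
    """Return the cron expression for a known phrase (case-insensitive)."""
    key = phrase.strip().lower()
    if key not in PATTERNS:
        raise ValueError(f"unrecognized schedule: {phrase!r}")
    return PATTERNS[key]
```

For example, `to_cron("every Friday")` yields `0 9 * * 5`: minute 0, hour 9, any day of month, any month, weekday 5 (Friday).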

How it works

1. Describe the job

Tell OpenSail what you want to build, or drop in a file. The agent plans the work, wires up the right tools, and asks clarifying questions.

2. Work in a real IDE

Monaco editor, a real terminal, a file tree, live preview with hot reload, git with branches and blame, a kanban board. The agent sees what you see.

3. Run in sandboxes

Every project runs in a three-tier compute model on Kubernetes. File ops and reasoning are near-free. Shell runs in warm pools. Full dev environments wake on demand and hibernate when idle.

4. Ship anywhere

Draw an edge from a container to a deploy target. 22 providers supported: Vercel, AWS, Cloudflare Pages, Fly.io, and many more. Or publish as a one-click installable app.

Technology

Layer | What
Frontend | React 19, TypeScript, Vite, Tailwind, Monaco Editor, XYFlow
Desktop shell | Tauri v2 (Rust), local SQLite, Stronghold token store
Backend | FastAPI, Python 3.11, SQLAlchemy, LiteLLM
Agent runner | tesslate-agent (Python), 33 built-in tools across 8 categories
Database | PostgreSQL (cloud), SQLite (desktop)
Task queue | Redis + ARQ (cloud), asyncio + apscheduler (desktop)
Storage | btrfs CSI driver, Volume Hub orchestrator, S3-backed CAS
Containers | Docker Compose (dev), Kubernetes (prod), subprocesses (desktop)
Routing | Traefik (Docker), NGINX Ingress (Kubernetes)

Three ways to run

Cloud

Managed at tesslate.com. Sign up and build.

Desktop

Native Tauri app for macOS, Windows, Linux.

Self-hosted

Docker or Kubernetes on your own hardware.

Why open source

Workspace agents are powerful. They touch your data, your tools, your processes. You should be able to see exactly what they are doing, run them on your own infrastructure, and not be locked to a single model provider. OpenSail runs on any model, deploys on any cloud, and keeps your data where you put it.

Next steps

Quickstart

Build your first agent in five minutes.

Install on Docker

Clone, configure, boot. Twenty minutes from zero to running.

Install on Desktop

macOS, Windows, Linux. No cluster needed.

Publish an app

Turn a workspace into a one-click installable app.