
Commit eeb6716

aadamsx and claude committed
docs: add export workflows documentation to README
- Document "Export as Service" feature in main README - Explain WORKSPACE_DIR for local file operations - Document MCP filesystem tool support - List supported features and security measures 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <[email protected]>
1 parent 12e86ae commit eeb6716

File tree: 1 file changed (+50, -0 lines)


README.md

Lines changed: 50 additions & 0 deletions
@@ -262,6 +262,56 @@ If ports 3000, 3002, or 5432 are in use, configure alternatives:
NEXT_PUBLIC_APP_URL=http://localhost:3100 POSTGRES_PORT=5433 docker compose up -d
```

## Export Workflows as Standalone Services

Export any workflow as a self-contained Python/FastAPI service that can be deployed independently.

### Quick Start
1. Right-click a workflow in the sidebar
2. Select **"Export as Service"**
3. Extract the ZIP file
4. Configure `.env` with your API keys (a sketch follows the commands below)
5. Run with Docker or directly:
```bash
# With Docker
docker compose up -d

# Or directly
pip install -r requirements.txt
uvicorn main:app --port 8080
```
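For step 4, here is a minimal `.env` sketch; the variable names are placeholders, so check the exported files for the exact keys your workflow's providers require:

```bash
# Hypothetical example: set only the keys your workflow's providers need
ANTHROPIC_API_KEY=sk-ant-...   # Claude models
OPENAI_API_KEY=sk-...          # GPT models
GOOGLE_API_KEY=...             # Gemini models
WORKSPACE_DIR=./workspace      # optional: enables sandboxed local file tools (see below)
```

Once the service is running, FastAPI's interactive API docs should be available at `http://localhost:8080/docs`, assuming the generated app keeps FastAPI's defaults.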
### File Operations

Exported services support two modes for agent file operations:

#### Local File Tools (Recommended for Simple Use Cases)

Set `WORKSPACE_DIR` in `.env` to enable sandboxed local file operations:

```bash
WORKSPACE_DIR=./workspace
```

Agents automatically get `local_write_file`, `local_read_file`, and `local_list_directory` tools. All paths are sandboxed—agents cannot access files outside the workspace.
With Docker, files appear in `./output/` on your host machine.
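For example, after a run you can inspect anything the agent wrote directly from the host:

```bash
# Files created via the local file tools land in ./output/ on the host
ls -l ./output/
```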
#### MCP Filesystem Tools
If your workflow uses MCP filesystem servers, those continue to work as configured. MCP servers handle file operations on their own systems.
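For reference, the official MCP filesystem server is typically launched along these lines; how (and whether) it is wired into the exported service depends entirely on your workflow's MCP configuration, and the allowed directory here is a placeholder:

```bash
# Official MCP filesystem server (Node-based); restricts access to the listed directory
npx -y @modelcontextprotocol/server-filesystem /path/to/allowed/dir
```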
You can use both options together—the LLM chooses the appropriate tool based on context.
### Supported Features
- **Multi-provider LLM support**: Anthropic (Claude), OpenAI (GPT), Google (Gemini)
- **All block types**: Agent, Function, Condition, Router, API, Loop, Variables, Response
- **MCP tools**: Full support via official Python SDK
- **Security**: No eval(), no shell=True, sandboxed file operations
## Tech Stack

- **Framework**: [Next.js](https://nextjs.org/) (App Router)
