Prompsy — Documentation

Build with Prompsy.

A quick tour of the platform. Five minutes from zero to your first orchestrated prompt.

1. Prompts

Every prompt has a body, optional {{variables}}, tags, and a recommended model. Versioning is automatic — every save creates a snapshot.
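A minimal sketch of how {{variables}} substitution might behave, in Python. The `render` helper is illustrative, not Prompsy's actual renderer; real behavior around escaping and missing variables may differ.

```python
import re

def render(body: str, variables: dict) -> str:
    """Replace each {{name}} placeholder in the prompt body with its value."""
    def sub(match):
        name = match.group(1).strip()
        if name not in variables:
            raise KeyError(f"missing variable: {name}")
        return str(variables[name])
    return re.sub(r"\{\{(.*?)\}\}", sub, body)

print(render("Summarize: {{text}}", {"text": "quarterly report"}))
# → Summarize: quarterly report
```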

2. Bakeoffs

Run the same prompt across multiple models, then let an AI judge pick the winner against a rubric. Useful for finding the cheapest model that still clears your quality bar.
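The selection logic behind "cheapest model that still meets quality" can be sketched as below. The model names, judge scores, and costs are illustrative placeholders, not real bakeoff output.

```python
# Hypothetical bakeoff result: per-model judge score (0-10) and cost
# per 1K tokens. All names and numbers here are made up for illustration.
results = {
    "model-a": {"score": 9.1, "cost": 0.030},
    "model-b": {"score": 8.4, "cost": 0.002},
    "model-c": {"score": 6.0, "cost": 0.001},
}

def cheapest_passing(results: dict, threshold: float = 8.0):
    """Return the cheapest model whose judge score clears the rubric threshold."""
    passing = {m: r for m, r in results.items() if r["score"] >= threshold}
    if not passing:
        return None
    return min(passing, key=lambda m: passing[m]["cost"])

print(cheapest_passing(results))
# → model-b  (passes the 8.0 bar at a fraction of model-a's cost)
```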

3. Tune

Paste a prompt and Prompsy returns a quality score, a list of weaknesses, and a rewritten version. One-click apply.
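The response might look something like the fragment below. This shape is a hypothetical illustration of the three pieces Tune returns (score, weaknesses, rewrite), not a documented schema.

```json
{
  "score": 62,
  "weaknesses": [
    "No output format specified",
    "Ambiguous audience"
  ],
  "rewritten": "Summarize {{text}} in three bullet points for a non-technical reader."
}
```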

4. Flows

A visual canvas for chaining prompts. Each node receives {{prev}} from upstream and any custom inputs you map. Run end-to-end with one button.
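The chaining behavior can be sketched as a loop that threads each node's output into the next node's {{prev}}. `run_prompt` below is a stand-in for a real model call; the flow engine itself surely does more (branching, error handling).

```python
def run_prompt(body: str) -> str:
    # Placeholder for an actual model call.
    return f"<output of: {body}>"

def run_flow(nodes: list[str], inputs: dict[str, str]) -> str:
    """Run prompt bodies in order, passing each output downstream as {{prev}}."""
    prev = ""
    for body in nodes:
        rendered = body.replace("{{prev}}", prev)
        for name, value in inputs.items():
            rendered = rendered.replace("{{" + name + "}}", value)
        prev = run_prompt(rendered)
    return prev

final = run_flow(
    ["Summarize: {{text}}", "Translate to French: {{prev}}"],
    {"text": "launch notes"},
)
print(final)
```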

5. Snap & Mirror

Drop a screenshot of any LLM UI; Prompsy extracts the underlying prompt template. Or paste reference text and it produces a style-matched prompt.

6. Public API

Generate a personal access token in Settings → Tokens, then:

# List prompts
GET /api/public/v1/prompts
Authorization: Bearer prompsy_...

# Create a prompt
POST /api/public/v1/prompts
Authorization: Bearer prompsy_...
Content-Type: application/json

{ "title": "Email summary", "body": "Summarize: {{text}}" }
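The create call above, sketched with Python's standard library. The base URL is a placeholder for your Prompsy host; the request is constructed but not sent, so you can inspect it before wiring in a real token.

```python
import json
import urllib.request

BASE = "https://example.com"  # placeholder: substitute your Prompsy host

payload = {"title": "Email summary", "body": "Summarize: {{text}}"}
req = urllib.request.Request(
    BASE + "/api/public/v1/prompts",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer prompsy_...",  # your token from Settings → Tokens
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; left commented so this runs offline.
print(req.get_method(), req.full_url)
# → POST https://example.com/api/public/v1/prompts
```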

7. Chrome extension

Install Prompsy for Chrome and type /prompsy in any text field to insert prompts from your library. Your library syncs automatically over the public API.