How it works
PromptRelay coordinates AI execution between maintainers and volunteers. The platform never touches credentials, code, or compute. It only routes tasks and collects results.
1. Volunteer installs the CLI daemon
A volunteer runs npx @promptrelay/volunteer to authenticate via GitHub OAuth and start a background daemon. The daemon polls Convex for queued tasks matching the volunteer's allowed categories (docs, tests, bugfix, review, refactor, translation). Configuration lives in ~/.config/promptrelay-volunteer, including max tasks per day, trusted projects, and enabled providers.
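The daemon's configuration might look like the sketch below. The field names are illustrative assumptions, not the actual schema; only the settings themselves (daily limits, trusted projects, enabled providers, category filters) come from the text above.

```typescript
// Hypothetical shape of ~/.config/promptrelay-volunteer — field names
// are assumptions for illustration, not the real config format.
interface VolunteerConfig {
  maxTasksPerDay: number;
  trustedProjects: string[];      // only claim tasks from these repos
  allowedCategories: string[];    // docs, tests, bugfix, review, refactor, translation
  providers: { claude: boolean; codex: boolean };
  manualApproval: boolean;        // require confirmation before each task
}

const config: VolunteerConfig = {
  maxTasksPerDay: 5,
  trustedProjects: ["octocat/hello-world"],
  allowedCategories: ["docs", "tests"],
  providers: { claude: true, codex: false },
  manualApproval: false,
};
```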
2. Maintainer creates a task
Through the web dashboard, a maintainer links a GitHub repo, writes a prompt, and selects a category and output type (answer, review, markdown, diff, or PR draft). The task enters the queue with status queued. Maintainers can optionally specify a preferred provider or model.
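A task record created through the dashboard might be modeled like this. The statuses, categories, and output types are taken from the text; the record shape and field names are assumptions.

```typescript
// Illustrative task record as it might be stored in Convex.
// Field names beyond those mentioned in the docs are assumptions.
type TaskStatus = "queued" | "claimed" | "running" | "completed" | "rejected";
type OutputType = "answer" | "review" | "markdown" | "diff" | "pr-draft";
type Category = "docs" | "tests" | "bugfix" | "review" | "refactor" | "translation";

interface Task {
  repo: string;                            // linked GitHub repo, e.g. "owner/name"
  prompt: string;
  category: Category;
  outputType: OutputType;
  status: TaskStatus;
  preferredProvider?: "claude" | "codex";  // optional maintainer hint
  preferredModel?: string;
}

const task: Task = {
  repo: "octocat/hello-world",
  prompt: "Add unit tests for the parser module",
  category: "tests",
  outputType: "diff",
  status: "queued",
};
```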
3. Daemon claims and executes
When the daemon finds an eligible task, it claims it (status moves to claimed, then running). If the task has a publicRepoUrl, the daemon first clones or pulls the repo into ~/.promptrelay/repos/ and creates a working branch promptrelay/&lt;task-id&gt;. The executor then spawns claude -p &lt;prompt&gt; --output-format text --dangerously-skip-permissions (or the Codex equivalent: codex --quiet --approval-mode full-auto &lt;prompt&gt;) as a child process inside the working copy. Claude Code receives a system prompt with the project name and task category, reads the codebase, and makes real file changes; Codex runs in full-auto mode with an equivalent prompt. Streaming output is pushed back to the maintainer via tasks:updateStream in 500ms intervals.
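The command lines above could be assembled by a small helper before being handed to a child process. The flags come from the text; the helper itself and its name are hypothetical.

```typescript
// Sketch of how the daemon might build the provider argv.
// The flags are from the docs; buildCommand itself is an assumption.
function buildCommand(provider: "claude" | "codex", prompt: string): string[] {
  if (provider === "claude") {
    return [
      "claude", "-p", prompt,
      "--output-format", "text",
      "--dangerously-skip-permissions",
    ];
  }
  return ["codex", "--quiet", "--approval-mode", "full-auto", prompt];
}
```

The resulting array would be passed to something like child_process.spawn with cwd set to the cloned repo, so all file changes land inside the working branch.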
4. Result submitted for review
On completion, the daemon writes the result back through tasks:complete with the output content, the provider and model used, and the execution duration. The maintainer reviews the result and accepts or rejects it. For diff/PR outputs, the daemon can push the branch and open a pull request automatically via the GitHub CLI.
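A completion payload carrying those fields might be assembled like this. The fields mirror the text; the exact argument names of tasks:complete are assumptions.

```typescript
// Hypothetical payload for the tasks:complete mutation.
// Field names are illustrative; the docs only name the contents.
interface CompletionPayload {
  taskId: string;
  output: string;
  provider: "claude" | "codex";
  model: string;
  durationMs: number;   // wall-clock execution time
}

function buildCompletion(
  taskId: string,
  output: string,
  provider: "claude" | "codex",
  model: string,
  startedAt: number,
): CompletionPayload {
  return { taskId, output, provider, model, durationMs: Date.now() - startedAt };
}
```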
5. What stays local
API keys, model access, and compute are the volunteer's. The platform stores task metadata, prompts, and results in Convex. No credentials cross the network. The volunteer's daemon controls what it runs: category filters, daily limits, manual approval mode, and a trusted-projects allowlist.
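The local controls listed above (category filters, daily limits, trusted-projects allowlist) amount to an eligibility check the daemon runs before claiming anything. A minimal sketch, assuming hypothetical names:

```typescript
// Sketch of the daemon's local eligibility check; all names here
// are illustrative assumptions, not the actual implementation.
interface LocalPolicy {
  allowedCategories: string[];
  trustedProjects: string[];
  maxTasksPerDay: number;
}

function isEligible(
  task: { repo: string; category: string },
  policy: LocalPolicy,
  claimedToday: number,
): boolean {
  return (
    claimedToday < policy.maxTasksPerDay &&
    policy.allowedCategories.includes(task.category) &&
    policy.trustedProjects.includes(task.repo)
  );
}
```

Because this check runs on the volunteer's machine, the platform cannot force a task onto a daemon: anything outside the allowlist is simply never claimed.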