Documentation Index

Fetch the complete documentation index at: https://docs.orca.so/llms.txt

Use this file to discover all available pages before exploring further.

LLM & AI Access

Orca documentation is structured for direct consumption by LLMs and AI tools. Whether you’re grounding a language model with protocol context or building an autonomous agent, the resources below give you everything you need.

MCP Server

Orca exposes a live Model Context Protocol (MCP) server — the fastest way to give any AI tool native access to Orca documentation.
https://docs.orca.so/mcp
The MCP server provides a SearchOrcaDocumentation tool that performs semantic search across all docs and returns titles, content excerpts, and direct page links. No API key is required.

Connect in Claude Desktop

Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "orca-docs": {
      "url": "https://docs.orca.so/mcp"
    }
  }
}

Connect in Cursor

Add to your .cursor/mcp.json:
{
  "mcpServers": {
    "orca-docs": {
      "url": "https://docs.orca.so/mcp"
    }
  }
}

Connect in any MCP-compatible client

The server uses the standard MCP HTTP transport. Initialize with:
curl -s -X POST https://docs.orca.so/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"my-agent","version":"1.0"}},"id":1}'
Then call the search tool:
curl -s -X POST https://docs.orca.so/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"SearchOrcaDocumentation","arguments":{"query":"how to open a CLMM position"}},"id":2}'
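The two JSON-RPC bodies above can also be built programmatically. A minimal Python sketch, with request shapes copied from the curl examples; `my-agent` and the query string are placeholders, and actually sending the requests would additionally need an HTTP client:

```python
import json

MCP_URL = "https://docs.orca.so/mcp"  # endpoint from the docs above

def jsonrpc_request(method: str, params: dict, request_id: int) -> str:
    """Build a JSON-RPC 2.0 request body as used by the MCP HTTP transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": request_id,
    })

# Mirror of the initialize call from the curl example.
init_body = jsonrpc_request("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "my-agent", "version": "1.0"},  # placeholder client name
}, request_id=1)

# Mirror of the SearchOrcaDocumentation tools/call.
search_body = jsonrpc_request("tools/call", {
    "name": "SearchOrcaDocumentation",
    "arguments": {"query": "how to open a CLMM position"},
}, request_id=2)

print(init_body)
print(search_body)
```

Each body would be POSTed to `MCP_URL` with a `Content-Type: application/json` header, exactly as in the curl commands.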

Quick Access Files

  • llms.txt (docs.orca.so/llms.txt): full structured reference with protocol constants, key concepts, SDK function signatures, and page links, optimized for LLM context

Using with AI Tools

ChatGPT, Claude, and Other LLMs

Paste this prompt to load Orca context into any LLM:
Fetch https://docs.orca.so/llms.txt and use it to answer questions about Orca.
llms.txt includes protocol constants, fee tiers, SDK function signatures, price math formulas, and links to all documentation pages — everything an LLM needs to reason about Orca accurately.

Cursor, Windsurf, and AI Code Editors

Add the MCP server (above) for native tool-use, or reference llms.txt directly:
  1. Cursor: Add https://docs.orca.so/llms.txt as a doc source in Settings → Docs
  2. Windsurf: Include the URL in your workspace context
  3. VS Code Copilot: Reference the URL directly in your prompts

Programmatic Access

# Fetch the full structured reference
curl https://docs.orca.so/llms.txt

llms.txt Standard

Orca follows the llms.txt specification, an emerging open standard that makes documentation natively accessible to AI systems — analogous to robots.txt for search crawlers. The format provides:
  • Protocol constants — program IDs, config addresses, fee tiers, error codes
  • Key concepts — tick math, sqrtPrice formulas, position lifecycle
  • SDK function signatures — ready-to-use TypeScript Kit references
  • Organized page links — every documentation page with a one-line description
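As a sketch of how an agent might consume the organized page links, the snippet below parses llms.txt-style markdown link lines. The sample text is illustrative only (the titles and URLs here are invented); the real file's contents live at https://docs.orca.so/llms.txt:

```python
import re

# Illustrative sample in the llms.txt markdown-link style; the actual file
# at https://docs.orca.so/llms.txt will differ.
sample = """\
# Orca

## Docs
- [Whirlpools Overview](https://docs.orca.so/whirlpools): Concentrated liquidity concepts
- [SDK Reference](https://docs.orca.so/sdk): TypeScript Kit function signatures
"""

# llms.txt lists each page as a markdown link with a one-line description.
LINK = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$")

pages = [m.groupdict() for line in sample.splitlines() if (m := LINK.match(line))]
for p in pages:
    print(p["title"], "->", p["url"])
```

A loader like this gives an agent a page index it can use to fetch only the documentation pages relevant to a given question.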

Best Practices

When using Orca docs with AI tools:
  1. Ground with llms.txt first — load it as context before asking protocol-specific questions
  2. Reference specific pages — for focused queries, link directly to the relevant documentation page
  3. Use the REST API for live data: llms.txt contains static reference; fetch https://api.orca.so/v2/solana/pools/search?q=SOL-USDC for real-time pool state
The documentation is continuously updated. AI tools accessing these URLs will always receive the latest content.
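For the real-time pool lookup mentioned above, the request URL can be assembled like this. A sketch only: it constructs the URL with a properly encoded query string, and makes no assumptions about the response schema, which this page does not document:

```python
from urllib.parse import urlencode

# REST endpoint from the best-practices note above.
BASE = "https://api.orca.so/v2/solana/pools/search"

def pool_search_url(query: str) -> str:
    """Build the live pool-search URL for a pair query such as SOL-USDC."""
    return f"{BASE}?{urlencode({'q': query})}"

url = pool_search_url("SOL-USDC")
print(url)  # https://api.orca.so/v2/solana/pools/search?q=SOL-USDC
```

Using `urlencode` keeps queries with special characters valid, rather than concatenating the raw string into the URL.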

Building an autonomous agent on Orca? See AI Agents on Orca for a complete guide.