Documentation Index
Fetch the complete documentation index at: https://docs.orca.so/llms.txt
Use this file to discover all available pages before exploring further.
LLM & AI Access
Orca documentation is structured for direct consumption by LLMs and AI tools. Whether you're grounding a language model with protocol context or building an autonomous agent, the resources below give you everything you need.

MCP Server
Orca exposes a live Model Context Protocol (MCP) server, the fastest way to give any AI tool native access to Orca documentation. Its SearchOrcaDocumentation tool performs semantic search across all docs and returns titles, content excerpts, and direct page links. No API key required.
Connect in Claude Desktop
Add to your claude_desktop_config.json:
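As a sketch, assuming the common `mcpServers` layout and the widely used `mcp-remote` bridge for remote servers; the server URL below is a placeholder, so substitute the address published in Orca's docs:

```json
{
  "mcpServers": {
    "orca-docs": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://<orca-mcp-server-url>"]
    }
  }
}
```

Restart Claude Desktop after saving so the new server is picked up.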
Connect in Cursor
Add to your .cursor/mcp.json:
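A minimal sketch, assuming Cursor's URL-based server entry for HTTP transports; the server URL is a placeholder to replace with the one from Orca's docs:

```json
{
  "mcpServers": {
    "orca-docs": {
      "url": "https://<orca-mcp-server-url>"
    }
  }
}
```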
Connect in any MCP-compatible client
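Any client that implements MCP can connect directly. As an illustrative sketch (the protocol version string and client fields below are assumptions, not values from Orca's docs), the handshake opens with a JSON-RPC `initialize` request:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

After the handshake, the client can invoke the SearchOrcaDocumentation tool with a `tools/call` request on the same endpoint.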
The server uses the standard MCP HTTP transport.

Quick Access Files
| File | URL | Description |
|---|---|---|
| llms.txt | docs.orca.so/llms.txt | Full structured reference: protocol constants, key concepts, SDK function signatures, and page links, optimized for LLM context |
Using with AI Tools
ChatGPT, Claude, and Other LLMs
Paste the contents of llms.txt (or its URL) into your prompt to load Orca context into any LLM. The file includes protocol constants, fee tiers, SDK function signatures, price math formulas, and links to all documentation pages, everything an LLM needs to reason about Orca accurately.
Cursor, Windsurf, and AI Code Editors
Add the MCP server (above) for native tool use, or reference llms.txt directly:
- Cursor: Add https://docs.orca.so/llms.txt as a doc source in Settings → Docs
- Windsurf: Include the URL in your workspace context
- VS Code Copilot: Reference the URL directly in your prompts
Programmatic Access
llms.txt Standard
Orca follows the llms.txt specification, an emerging open standard that makes documentation natively accessible to AI systems, analogous to robots.txt for search crawlers. The format provides:
- Protocol constants — program IDs, config addresses, fee tiers, error codes
- Key concepts — tick math, sqrtPrice formulas, position lifecycle
- SDK function signatures — ready-to-use TypeScript Kit references
- Organized page links — every documentation page with a one-line description
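To make the format concrete, here is a minimal sketch of consuming an llms.txt-style link list programmatically. The parsing rules and the sample snippet are illustrative assumptions based on the llms.txt convention (H2 section headings followed by markdown link lines), not the official spec or Orca's actual file contents:

```typescript
// One parsed entry from an llms.txt-style document.
interface DocLink {
  section: string;
  name: string;
  url: string;
  description: string;
}

// Walk the file line by line, tracking the current "## Section" heading
// and collecting "- [name](url): description" link lines under it.
function parseLlmsTxt(text: string): DocLink[] {
  const links: DocLink[] = [];
  let section = "";
  for (const line of text.split("\n")) {
    const heading = line.match(/^##\s+(.*)/);
    if (heading) {
      section = heading[1].trim();
      continue;
    }
    const link = line.match(/^-\s+\[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?/);
    if (link) {
      links.push({ section, name: link[1], url: link[2], description: link[3] ?? "" });
    }
  }
  return links;
}

// Hypothetical snippet for illustration (not the real file contents):
const sample = `# Orca
## Docs
- [Whirlpools](https://docs.orca.so/whirlpools): Concentrated liquidity overview
`;
console.log(parseLlmsTxt(sample));
```

In practice you would fetch https://docs.orca.so/llms.txt and feed the response body to a parser like this, or simply paste the raw file into an LLM's context.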
Best Practices
When using Orca docs with AI tools:
- Ground with llms.txt first: load it as context before asking protocol-specific questions
- Reference specific pages: for focused queries, link directly to the relevant documentation page
- Use the REST API for live data: llms.txt contains static reference; fetch https://api.orca.so/v2/solana/pools/search?q=SOL-USDC for real-time pool state
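The REST call in the last bullet can be sketched as follows. Building the request URL is straightforward; the response shape is not documented here, so the commented fetch step is an assumption to verify against the live API:

```typescript
// Build the pools-search URL named in the docs; URL handles query encoding.
function poolsSearchUrl(query: string): string {
  const url = new URL("https://api.orca.so/v2/solana/pools/search");
  url.searchParams.set("q", query);
  return url.toString();
}

// For live pool state, run in any fetch-capable runtime (response shape unverified):
// const pools = await (await fetch(poolsSearchUrl("SOL-USDC"))).json();
console.log(poolsSearchUrl("SOL-USDC"));
```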
The documentation is continuously updated. AI tools accessing these URLs will always receive the latest content.
Building an autonomous agent on Orca? See AI Agents on Orca for a complete guide.
