By Casey Smith, Payabli Docs
Key Takeaways:
- AI coding assistants can hallucinate payment API endpoints when they lack grounded context; Payabli offers several tools (MCP server, markdown docs, llms.txt indexes, OpenAPI spec, server SDKs) to fix that.
- The MCP server takes about 30 seconds to set up in Cursor or other third-party tools and covers most integration questions.
- Section-level llms.txt endpoints let you load targeted context without burning your entire context window.
- Project instructions files (CLAUDE.md, AGENTS.md) give your assistant a persistent context across every conversation.
If you’re using Cursor, Claude Code, Windsurf, or another AI coding assistant to build your Payabli integration, you’ve probably run into this: you ask your assistant how to make a transaction, and it generates plausible-looking code that calls endpoints that don’t exist or uses parameter names it invented. This isn’t a sign that your assistant is bad at code; it’s a sign that it lacks real, grounded context about Payabli’s APIs.
This post is about fixing that. We have multiple tools for you to give your AI assistant accurate, up-to-date Payabli context: an MCP server, per-page markdown, server SDKs, section-level LLM indexes, and a full OpenAPI spec. Here’s when to use each one.
How to set up the Payabli MCP server
MCP (Model Context Protocol) is a standard that lets your AI tools query external knowledge sources directly from your IDE. Instead of searching on your own and then copying and pasting docs into the chat, your assistant can search them on demand.
The Payabli MCP server gives your assistant two tools:
- search-payabli-docs: searches official Payabli documentation
- ask-question-about-payabli: searches the SDK repos and other resources
Setting up in Cursor takes about 30 seconds. Create or open .cursor/mcp.json in your project root and add:
```json
{
  "mcpServers": {
    "inkeepMcp": {
      "url": "https://mcp.inkeep.com/payabli/mcp"
    }
  }
}
```
Restart Cursor and you’re done. For Windsurf and Claude, see the full setup guide. It’s the same idea, just different config file locations.
Once it’s running, you can ask your assistant things like “how do I make a sale transaction” or “what does the boarding flow look like” and it will search the docs rather than guess.
How to load a Payabli docs page into your AI assistant
Sometimes you want to point your assistant at one specific doc page (the webhook reference, the API overview, a particular guide) without loading everything. Append .md to any URL on docs.payabli.com to get a clean markdown version with no navigation or JavaScript overhead:
https://docs.payabli.com/guides/pay-in/transactions ← full page
https://docs.payabli.com/guides/pay-in/transactions.md ← markdown only
You can paste that markdown URL directly into your assistant’s context, or use it with a fetch tool if your IDE supports that. It works on every page across the site.
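If you’re scripting this, the transformation is just a suffix append. A minimal sketch in Python (the helper name is ours, for illustration; it isn’t part of any Payabli tooling):

```python
def to_markdown_url(docs_url: str) -> str:
    """Convert a docs.payabli.com page URL to its markdown-only variant.

    Per the docs, appending .md to any page URL returns clean markdown
    with no navigation or JavaScript overhead.
    """
    # Strip a trailing slash so the suffix lands on the page path itself
    base = docs_url.rstrip("/")
    return base if base.endswith(".md") else base + ".md"

# Fetching the markdown with the standard library would look like this
# (commented out here since it needs network access):
# from urllib.request import urlopen
# markdown = urlopen(to_markdown_url(
#     "https://docs.payabli.com/guides/pay-in/transactions")).read().decode()

print(to_markdown_url("https://docs.payabli.com/guides/pay-in/transactions"))
```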
As we author our documentation, we tag content that doesn’t translate well into an AI context: React-based decision trees, complicated SVG diagrams, and other visual or interactive elements optimized for human readers. This tagging excludes that content from the markdown versions, and we add plain-text equivalents for anything important that would otherwise be lost. The result is that markdown versions use less of your context window and contain only content that’s actually meant for an AI to read.
Which Payabli llms.txt file should you use?
When you need your assistant to understand the Payabli landscape (general concepts, integration workflows, which API endpoints exist, and other resources), use LLM-optimized indexes at the site level and per section. The site-wide llms.txt (~112 KB) is a structured directory of every page with brief descriptions, useful for giving your assistant a map of what exists.
If you want to load actual content at the site level, llms-full.txt has everything in one file, though at ~7.5 MB it’s heavy and will eat a significant chunk of your context window. For actual content, the section-level endpoints are usually the right call. They’re smaller and more targeted:
| What you’re working on | Endpoint |
| --- | --- |
| API reference | /developers/api-reference/llms-full.txt (~40 KB index) |
| Developer tools + SDKs | /developers/llms-full.txt (~5.3 MB, or ~2.2 MB without the spec) |
| Guides only | /guides/llms-full.txt |
| Cookbooks | /cookbooks/llms-full.txt (~27 KB) |
If you’re only working in one language, the developer tools endpoints accept a lang query parameter so you’re not burning tokens on SDK examples in languages you don’t use:
https://docs.payabli.com/developers/llms-full.txt?lang=python
Which Payabli server SDK should you install for AI context?
We publish official server SDKs in TypeScript/Node, Python, C#, Java, Go, Ruby, PHP, and Rust, all generated from the OpenAPI spec so they stay in sync with the API. Once the SDK is installed in your project, your AI assistant can read the types and method signatures directly from your dependencies. That’s usually enough to stop it from generating code that doesn’t match our API. See the server SDK overview to find your language.
How do you use the Payabli OpenAPI spec?
If you’re doing API-heavy work and want your assistant to have complete, authoritative knowledge of the Payabli API, drop in the full OpenAPI specification. At ~1.3 MB, it’s a YAML file covering every endpoint, request parameter, and response schema.
You can download it directly or reference it by URL in your project. Most AI coding tools can load it as a file or fetch it directly. It also works with API clients like Postman and Insomnia if you want to explore endpoints outside of your IDE.
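Once the spec is parsed (for example with `yaml.safe_load` from PyYAML), enumerating what the API exposes is a dictionary walk over the standard OpenAPI `paths` object. A sketch, using a tiny hand-written stand-in for the parsed spec (the paths below are illustrative, not Payabli’s real endpoints):

```python
# Stand-in for the parsed OpenAPI spec; the real one is ~1.3 MB of YAML.
# These paths are made up for illustration only.
spec = {
    "paths": {
        "/example/charge": {
            "post": {"summary": "Create a charge (illustrative)"},
        },
        "/example/charge/{id}": {
            "get": {"summary": "Retrieve a charge (illustrative)"},
        },
    }
}

def list_endpoints(spec: dict) -> list[str]:
    """Flatten an OpenAPI paths object into sorted 'METHOD /path' strings."""
    # Path items can also hold keys like "parameters", so filter to HTTP verbs
    methods = {"get", "post", "put", "patch", "delete", "head", "options"}
    return sorted(
        f"{verb.upper()} {path}"
        for path, ops in spec.get("paths", {}).items()
        for verb in ops
        if verb in methods
    )

for endpoint in list_endpoints(spec):
    print(endpoint)
```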
How do you set up project instructions for AI coding tools?
This one is worth spending a few minutes on. Most AI coding tools support a project-level instructions file that gets loaded automatically at the start of every conversation. The filename varies by tool (CLAUDE.md for Claude Code, AGENTS.md for Cursor and Codex, .github/copilot-instructions.md for Copilot), but the idea is the same: you write it once, and your assistant always knows the basics about your integration.
Your repo-level file doesn’t need to cover everything. A good pattern is to keep it high-level and point to more detailed instructions for specific parts of your codebase. For example, if you have a payments submodule with its own conventions, you can maintain a separate instructions file there and just reference it from the root:
```md
# Payabli integration context
- Environment: sandbox
- orgId: [your-test-org-id]
- SDK: TypeScript
- Error handling: we wrap all Payabli errors in our ApiError class (see src/errors.ts)
- Webhook validation: see src/webhooks/verify.ts
- Payments submodule: see src/payments/AGENTS.md for detailed context
```
One important caveat: don’t put secrets in this file. API tokens, credentials, and anything sensitive should stay in environment variables or a secrets manager. The instructions file is for context and conventions, not credentials, and it’ll likely end up committed to your repo.
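In code, that means reading credentials from the environment rather than from anything checked into the repo. A minimal sketch (the variable name PAYABLI_API_TOKEN is our convention for this example, not a Payabli requirement):

```python
import os

def get_api_token() -> str:
    """Read the API token from the environment.

    The instructions file can safely say "the token comes from the
    PAYABLI_API_TOKEN environment variable" without ever containing
    the value itself.
    """
    token = os.environ.get("PAYABLI_API_TOKEN")
    if not token:
        raise RuntimeError(
            "PAYABLI_API_TOKEN is not set; export it or use a secrets manager"
        )
    return token
```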
Check your tool’s docs for the exact filename. The content is the same regardless.
Which AI context tool is right for your Payabli integration?
Not sure where to start? Use this as a quick reference:
| Situation | What to use |
| --- | --- |
| Starting a new integration | MCP server + SDK |
| Need accurate method names and parameters | SDK + OpenAPI spec |
| Need details on one specific page | .md suffix |
| The assistant needs to understand the doc landscape | llms.txt index |
| Loading content for a specific section | Section-level llms-full.txt |
| Persistent project context | Instructions file in your repo |
Start with the MCP server and the SDK for your language. That combination covers most questions your assistant will run into during a typical integration. Add the others as your work gets more specific: a particular guide page, a section-level index for heavy work that needs more context, a project instructions file once your team has patterns worth preserving.
We want Payabli to be the easiest payments company to integrate with, and AI assistants work better with real, grounded context, so we built these tools to provide it. The better the context, the better your AI agents understand Payabli, and the faster you can integrate.
Learn about our developer tools and agent resources in the Payabli Docs.

