Build on Juspay with LLMs
Use large language models (LLMs) to build Juspay integrations faster. This page covers our tools and best practices for AI-assisted development: Juspay Genius, MCP tools for AI agents, and plain-text documentation endpoints that streamline your payment workflows and reduce development time.
AI Assistance with Juspay Genius
Juspay provides Genius, an AI-powered assistant that helps developers find answers and understand integrations faster.
Genius allows users to ask questions directly within the documentation and receive responses based on relevant Juspay documentation and integration guides. This helps developers quickly resolve doubts without needing to manually search across multiple pages.
With Juspay Genius, users can:
- Ask questions related to Juspay integrations.
- Get answers based on the latest Juspay documentation.
- Discover relevant documentation pages and integration guides.
- Quickly resolve common implementation issues.
Users can access Genius directly from the documentation and interact with it by asking natural language questions about integrations, APIs, SDKs, or troubleshooting steps.
Juspay MCP Tools for AI Agents
Juspay provides MCP (Model Context Protocol) tools that allow AI agents to discover and retrieve relevant documentation programmatically. These tools help AI systems fetch the correct documentation sources and return structured Markdown content for better reasoning.
The MCP tools are available at: https://github.com/juspay/juspay-mcp/tree/main/juspay_docs_mcp
AI agents can use the following tools to discover and retrieve documentation content:
1. list_doc_sources
The list_doc_sources tool lists available documentation sources for Juspay integrations. It returns URLs to the relevant llms.txt files for different products such as Payment Page, Express Checkout, and API documentation.
Required parameters
- `platform` — Platform used for the integration (`android`, `ios`, `react_native`, `web`)
- `client_id`
- `merchant_id`
- `integration_type` — One of `payment-page-cat`, `payment-page-signature`, `express-checkout`, `api`

Optional parameters
- `ec_flow` — Used when the integration type is `express-checkout`.
This tool helps AI agents determine which documentation sources should be used for a specific integration.
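As a minimal sketch, an agent can invoke this tool through the standard MCP `tools/call` JSON-RPC envelope. The argument names below follow the parameters documented above; the `client_id`, `merchant_id`, and `ec_flow` values are hypothetical placeholders, not real credentials.

```python
import json

def build_list_doc_sources_request(platform, client_id, merchant_id,
                                   integration_type, ec_flow=None):
    """Build an MCP tools/call request for the list_doc_sources tool.

    The envelope is the standard JSON-RPC 2.0 shape used by MCP; the
    argument names mirror the parameters documented above.
    """
    arguments = {
        "platform": platform,
        "client_id": client_id,
        "merchant_id": merchant_id,
        "integration_type": integration_type,
    }
    if ec_flow is not None:
        # ec_flow is only meaningful for express-checkout integrations
        arguments["ec_flow"] = ec_flow
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": "list_doc_sources", "arguments": arguments},
    }

request = build_list_doc_sources_request(
    platform="android",
    client_id="demo_client",       # hypothetical value
    merchant_id="demo_merchant",   # hypothetical value
    integration_type="express-checkout",
    ec_flow="upi",                 # hypothetical flow name
)
print(json.dumps(request, indent=2))
```

How the request is transported (stdio, HTTP, etc.) depends on the MCP client you use; the payload shape above is the same either way.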
2. fetch_docs
The fetch_docs tool retrieves documentation content from a given URL and returns it as clean, unmodified Markdown. AI agents can use this tool to:
- Retrieve the contents of an `llms.txt` file to discover documentation pages.
- Fetch specific documentation pages for deeper context.
The tool accepts any documentation URL returned by the list_doc_sources tool.
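Putting the two tools together, an agent's discovery loop can be sketched as follows. The `call_tool` helper, the placeholder argument values, and the `url` parameter name for `fetch_docs` are assumptions for illustration; the actual names are defined by the MCP server's tool schema.

```python
def discover_and_fetch(call_tool, source_index=0):
    """Two-step sketch: list documentation sources, then fetch one.

    `call_tool(name, arguments)` stands in for whatever MCP client
    transport the agent uses; it is assumed to return the tool's
    text result as a string.
    """
    sources = call_tool("list_doc_sources", {
        "platform": "web",
        "client_id": "demo_client",      # hypothetical value
        "merchant_id": "demo_merchant",  # hypothetical value
        "integration_type": "api",
    })
    # Keep only lines that look like URLs returned by list_doc_sources
    urls = [line.strip() for line in sources.splitlines()
            if line.strip().startswith("http")]
    # fetch_docs accepts any documentation URL returned above;
    # the "url" argument name is an assumption
    return call_tool("fetch_docs", {"url": urls[source_index]})
```

The split into two calls mirrors the intended flow: `list_doc_sources` narrows the agent to the right product's docs, and `fetch_docs` pulls the Markdown the agent actually reasons over.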
Plain Text Documentation
You can access Juspay documentation as plain text Markdown files by adding .md to the end of any documentation URL. For example, the plain text version of this page can be accessed at https://juspay.io/in/docs/resources/docs/llm-guide/build-with-llm.md
This allows AI tools and agents to easily consume our documentation and makes it simple to copy and paste the complete contents of a page into an LLM.
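A small helper can derive the plain-text URL from any docs URL. This is a sketch of the ".md suffix" rule described above; the fragment- and slash-stripping behavior is an assumption about how the rendered URLs map to Markdown files.

```python
def to_markdown_url(doc_url: str) -> str:
    """Return the plain-text Markdown variant of a Juspay docs URL."""
    # Strip any fragment and trailing slash so ".md" lands on the path
    # (an assumption about the URL scheme, not documented behavior).
    url = doc_url.split("#")[0].rstrip("/")
    if url.endswith(".md"):
        return url
    return url + ".md"

print(to_markdown_url(
    "https://juspay.io/in/docs/resources/docs/llm-guide/build-with-llm"
))
# → https://juspay.io/in/docs/resources/docs/llm-guide/build-with-llm.md
```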
Using the Markdown version of the documentation is preferable to scraping or copying from HTML-rendered pages because:
Plain text contains fewer formatting tokens, making it easier for LLMs to process.
Markdown preserves the document hierarchy, helping LLMs better understand the structure and relationships between sections.
The content can be copied directly into prompts or AI tools without additional formatting cleanup.
Juspay provides an llms.txt file at the root level (juspay.io/in/docs/llms.txt) to help AI tools and agents easily discover and retrieve plain-text versions of our documentation. This file follows an emerging web convention that optimizes content delivery for Large Language Models.
In addition to the primary index, Juspay offers document-specific llms.txt files. These sub-indexes allow AI systems to access structured information for specific documentation sections more efficiently. You can find the links to all available document-specific LLM files conveniently listed within the root llms.txt file.
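An agent that has fetched the root `llms.txt` can extract the sub-index links with a short parser. This sketch assumes the file uses the common llms.txt convention of Markdown-style `[Title](url)` entries; the sample content below is invented for illustration.

```python
import re

def extract_links(llms_txt: str):
    """Pull (title, url) pairs out of Markdown-style links in an llms.txt file.

    Assumes the common llms.txt convention of "[Title](url)" entries;
    real files may interleave prose and section headers.
    """
    return re.findall(r"\[([^\]]+)\]\((https?://[^)]+)\)", llms_txt)

# Invented sample content for illustration only
sample = """# Juspay Docs
- [Payment Page](https://juspay.io/in/docs/payment-page.md)
- [Express Checkout](https://juspay.io/in/docs/express-checkout.md)
"""
for title, url in extract_links(sample):
    print(title, "->", url)
```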