Use AI to browse Xsolla documentation

AI assistants can help you integrate Xsolla faster by accessing documentation optimized for large language models (LLMs).

Use the built-in AI assistant

Xsolla documentation includes a built-in AI assistant that can search documentation and answer questions about Xsolla solutions and products in any language. This is the easiest way to get AI-powered help.

How to access

You can open the AI assistant in any of these ways:

  • Click the Ask AI button in the documentation interface.
  • Press Cmd+K (Mac) or Ctrl+K (Windows, Linux).
  • Click the search bar.

Access plain-text documentation

Xsolla documentation is available in markdown format for AI consumption. The plain-text format reduces token usage and provides clean content for LLMs.

Get markdown files

Every documentation page is available as a markdown file. To access the markdown version of any page, append .md to the URL: https://developers.xsolla.com/get-started/ai-assistants.md. AI assistants can fetch these pages directly, or you can copy and paste the plain-text content into any LLM.
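As a minimal sketch, the append-`.md` convention can be applied programmatically. The helper below is illustrative (not an official Xsolla client); it rewrites a docs page URL to its markdown variant, which can then be fetched with any HTTP client:

```python
from urllib.parse import urlparse, urlunparse

def markdown_url(page_url: str) -> str:
    """Return the plain-text (.md) variant of an Xsolla docs page URL."""
    parts = urlparse(page_url)
    path = parts.path.rstrip("/")  # drop a trailing slash, if any
    if not path.endswith(".md"):
        path += ".md"
    return urlunparse(parts._replace(path=path))

# The markdown version can then be fetched, e.g.:
#   import urllib.request
#   text = urllib.request.urlopen(markdown_url(page)).read().decode()
print(markdown_url("https://developers.xsolla.com/get-started/ai-assistants/"))
```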

Use llms.txt

Xsolla provides an /llms.txt file available at https://developers.xsolla.com/llms.txt. This follows the llms.txt standard and provides a map of Xsolla documentation that AI assistants can use to navigate content efficiently.
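An llms.txt file is markdown: an H1 title, an optional blockquote summary, and sections of link lists. A rough parser sketch (illustrative only; the sample content and field names are my own, not taken from Xsolla's actual file):

```python
import re

# Matches llms.txt link entries of the form: - [Title](url): optional note
LINK = re.compile(r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<note>.*))?")

def parse_llms_txt(text: str) -> list[dict]:
    """Extract title/url/note entries from an llms.txt document."""
    entries = []
    for line in text.splitlines():
        m = LINK.match(line.strip())
        if m:
            entries.append(m.groupdict())
    return entries

sample = """# Xsolla
> Developer documentation.

## Docs
- [AI assistants](https://developers.xsolla.com/get-started/ai-assistants.md): Using AI with the docs
"""
for e in parse_llms_txt(sample):
    print(e["title"], e["url"])
```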

Use Model Context Protocol (MCP)

For advanced integration, Xsolla provides an MCP server that allows AI coding assistants to interact with Xsolla’s documentation directly. This is ideal for developers using tools like Claude Code, Cursor, or VS Code with Copilot.

The MCP server is available at https://xsolla.mcp.kapa.ai.

Note:

Authentication: The MCP server uses Google OAuth for authentication. This only requests the openid scope and collects no personal information — just an anonymous ID token to enforce rate limits and prevent abuse.

Rate limits: 40 requests per user per hour, 200 requests per user per day.
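A script calling the server can track its own request timestamps to stay under these limits. The sliding-window throttle below is a client-side courtesy sketch only (the server enforces the real limits):

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Client-side courtesy limiter: allow() returns False when the window is full."""

    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window = window_seconds
        self.stamps: deque = deque()

    def allow(self, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self.stamps and now - self.stamps[0] >= self.window:
            self.stamps.popleft()
        if len(self.stamps) < self.max_requests:
            self.stamps.append(now)
            return True
        return False

# Mirror the documented limits: 40 requests/hour and 200 requests/day.
hourly = SlidingWindowLimiter(40, 3600)
daily = SlidingWindowLimiter(200, 86400)
```

A request should proceed only when both limiters return True.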

Available tools

The MCP server provides the search_xsolla_knowledge_sources tool. It performs semantic search over Xsolla documentation and returns relevant content chunks with source URLs and markdown formatting.
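MCP uses JSON-RPC 2.0, so invoking this tool is a tools/call request. The sketch below shows the general message shape; the argument name "query" is an assumption, so consult the tool schema the server advertises via tools/list:

```python
import json

def make_tool_call(request_id: int, query: str) -> str:
    """Build a JSON-RPC 2.0 tools/call message for the Xsolla search tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "search_xsolla_knowledge_sources",
            # "query" is a hypothetical argument name for illustration.
            "arguments": {"query": query},
        },
    })

print(make_tool_call(1, "How do I create a payment token?"))
```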

Set up

The examples below provide quick reference for manually configuring common AI coding assistants.

Claude Code

Run this command to add Xsolla’s MCP server:

claude mcp add --transport http xsolla-docs https://xsolla.mcp.kapa.ai

Cursor

Install in Cursor or manually add the following lines to your ~/.cursor/mcp.json file:

{
  "mcpServers": {
    "xsolla-docs": {
      "url": "https://xsolla.mcp.kapa.ai"
    }
  }
}
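If ~/.cursor/mcp.json already exists, the new entry should be merged in rather than replacing other servers. A small, hypothetical merge helper (the "other-docs" entry is invented for illustration):

```python
import json

def add_mcp_server(config_text: str, name: str, url: str) -> str:
    """Merge one MCP server entry into an existing mcp.json document."""
    config = json.loads(config_text) if config_text.strip() else {}
    config.setdefault("mcpServers", {})[name] = {"url": url}
    return json.dumps(config, indent=2)

existing = '{"mcpServers": {"other-docs": {"url": "https://example.com/mcp"}}}'
print(add_mcp_server(existing, "xsolla-docs", "https://xsolla.mcp.kapa.ai"))
```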

VS Code with Copilot

Install in VS Code or manually add the following lines to your VS Code settings (.vscode/settings.json or user settings):

{
  "github.copilot.chat.mcp.servers": {
    "xsolla-docs": {
      "url": "https://xsolla.mcp.kapa.ai"
    }
  }
}

Claude Desktop

Xsolla provides a remote MCP server, which is available to Claude Pro, Max, Team, and Enterprise plan users. Remote MCP servers must be added through the Claude Desktop UI as custom connectors.

For Pro and Max users:

  1. Go to the Settings > Connectors section.
  2. Add the server URL: https://xsolla.mcp.kapa.ai.
  3. Authenticate with Google when prompted.
  4. Enable the Xsolla connector.

For Team and Enterprise users:

Ask your Primary Owner or Owner to configure the connector at the organization level. Once configured, you can connect and enable it in your Settings > Connectors.

For detailed setup instructions, see Getting started with custom connectors using remote MCP.

Last updated: March 31, 2026
