Model Context Protocol (MCP) Server

This page explains how to connect natural language clients to KDB.AI using the Model Context Protocol (MCP), and introduces the KDB.AI MCP Server.

Overview

The MCP Server integration allows KDB.AI to work with any client that supports MCP. Client applications built on models such as Claude, GPT, or Gemini can send natural language requests, whether initiated by users or by AI models, and interact directly with KDB.AI.

KDB.AI can return structured, context-aware responses using its intelligent prompts, curated tools, and production-grade resources—all exposed through the MCP Server.

Designed for compatibility with the growing MCP ecosystem, the integration follows the MCP standard defined by Anthropic. Its modular design supports product-specific APIs and custom logic, making it easy for technical teams to extend functionality.

The KDB.AI MCP Server is built on a flexible framework with:

  • Curated prompts and tools for queries, table exploration, and LLM guidance

  • Configurable templates for adapting behavior to different products or use cases

  • Built-in protections that block destructive operations

  • Auto-discovery of tools, prompts, and resources—no manual setup required
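To make the "built-in protections" bullet concrete, here is a minimal sketch of the kind of guard such a server might apply: reject any request whose query text contains a destructive keyword before it ever reaches KDB.AI. The function names and keyword list are illustrative assumptions, not the server's actual implementation.

```python
# Hypothetical guard against destructive operations.
# Keywords and function names are illustrative, not the real server API.
DESTRUCTIVE_KEYWORDS = {"drop", "delete", "truncate", "alter"}

def is_destructive(query: str) -> bool:
    """Return True if the query text contains a destructive keyword."""
    tokens = query.lower().split()
    return any(tok in DESTRUCTIVE_KEYWORDS for tok in tokens)

def guard(query: str) -> str:
    """Pass a safe query through; block a destructive one."""
    if is_destructive(query):
        raise PermissionError(f"Blocked destructive operation: {query!r}")
    return query

guard("select * from trades")   # passes through unchanged
# guard("drop table trades")    # would raise PermissionError
```

A real server would likely validate at the tool level rather than by keyword scanning, but the effect is the same: destructive requests are stopped at the MCP boundary.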

How it works

The end user enters a natural language request through an MCP client application. The LLM embedded in or connected to the client interprets the request and selects the appropriate tools, prompts, or resources exposed by the MCP Server.

The client then sends the selected operation to the MCP Server, which routes the request to KDB.AI and returns a structured result. This interaction pattern supports common use cases like enterprise AI assistants, database explorers, and conversational data tools.
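When the client sends a selected operation to the server, it does so as a JSON-RPC 2.0 message, the wire format the MCP standard defines. The sketch below builds a `tools/call` request; the envelope fields follow the MCP specification, while the tool name and arguments are hypothetical placeholders, not the actual tools the KDB.AI MCP Server exposes.

```python
import json

# A JSON-RPC 2.0 "tools/call" request, as defined by the MCP specification.
# The tool name and arguments below are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "similarity_search",  # hypothetical KDB.AI tool name
        "arguments": {
            "table": "documents",
            "query": "quarterly revenue",
            "n": 5,
        },
    },
}

payload = json.dumps(request)
print(payload)
```

The server routes a request like this to KDB.AI, then returns the result in a matching JSON-RPC response keyed by the same `id`.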

Compatibility

Compatible with KDB.AI 1.7.0 and later.

Get started

Configuration and usage instructions are available in the MCP Server README file.
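As an orientation for what that configuration typically looks like, many MCP clients (Claude Desktop, for example) register servers in a JSON file under an `mcpServers` key. The entry below is a hedged sketch only: the server name, command, and arguments are placeholders, and the actual values to use are those given in the MCP Server README.

```json
{
  "mcpServers": {
    "kdbai": {
      "command": "uv",
      "args": ["run", "kdbai-mcp-server"]
    }
  }
}
```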