Kontent.ai MCP server
The Kontent.ai MCP Server connects your Kontent.ai projects with AI-powered tools (like Claude, Cursor, or VS Code) using the Model Context Protocol (MCP). It allows AI to understand and work with your project’s content model and perform operations via natural language.
Understand the MCP server
The Model Context Protocol (MCP) is an open standard that defines how applications provide context to large language models (LLMs). Think of MCP as a universal connector, allowing AI models to interface with various tools and data sources through a standardized protocol. With the Kontent.ai MCP server, you can:

- Empower AI assistants to read your Kontent.ai content model structure and create new content types, snippets, and taxonomies
- Accelerate content model creation by sending diagrams or descriptions that the AI interprets and implements
- Integrate with multiple AI clients through a standard JSON configuration format
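Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages over the chosen transport. For example, a client discovers which operations a server offers by sending a `tools/list` request (this message shape comes from the MCP specification, not from Kontent.ai specifically):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

The server replies with a list of tool descriptions (name, description, and input schema), which the AI client then uses to decide which tool to invoke for a given user request.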
How to run the MCP server
Prerequisites
Before setting up the MCP server, ensure you have:

- The latest Node.js installed
- A Kontent.ai Management API key with the Manage content model permission
- Your Kontent.ai environment ID
Setup options
You can run the Kontent.ai MCP server with `npx`, choosing between the STDIO and SSE transports.

STDIO transport: Ideal for direct integration with AI tools that support STDIO communication.

```
npx @kontent-ai/mcp-server@latest stdio
```

SSE transport: Runs the server over HTTP for clients that connect via Server-Sent Events; the port is set with the `PORT` environment variable.

```
npx @kontent-ai/mcp-server@latest sse
```
Configuration
The server requires the following environment variables:

| Variable | Description | Required |
|----------|-------------|----------|
| `KONTENT_API_KEY` | Your Kontent.ai Management API key | ✅ |
| `KONTENT_ENVIRONMENT_ID` | Your environment ID | ✅ |
| `PORT` | Port for SSE transport (defaults to 3001) | ❌ |
You can also provide these variables in a `.env` file.
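For example, a `.env` file for the server might look like this (the placeholder values are yours to fill in):

```
KONTENT_API_KEY=<your-management-api-key>
KONTENT_ENVIRONMENT_ID=<your-environment-id>
PORT=3001
```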
Transport configuration
STDIO transport
To run the server with the STDIO transport, configure your MCP client with the following JSON:

```json
{
  "kontent-ai-stdio": {
    "command": "npx",
    "args": ["@kontent-ai/mcp-server@latest", "stdio"],
    "env": {
      "KONTENT_API_KEY": "<your-management-api-key>",
      "KONTENT_ENVIRONMENT_ID": "<your-environment-id>"
    }
  }
}
```
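Note that the exact configuration file shape varies by client. Many MCP clients (for example, Claude Desktop and Cursor) expect server entries nested under a top-level `mcpServers` key; check your client's documentation for the precise format. A typical full configuration file might look like:

```json
{
  "mcpServers": {
    "kontent-ai-stdio": {
      "command": "npx",
      "args": ["@kontent-ai/mcp-server@latest", "stdio"],
      "env": {
        "KONTENT_API_KEY": "<your-management-api-key>",
        "KONTENT_ENVIRONMENT_ID": "<your-environment-id>"
      }
    }
  }
}
```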
SSE transport
To run the server with the SSE transport, follow these steps:

1. Start the server:

   ```
   npx @kontent-ai/mcp-server@latest sse
   ```

   with the following environment variables set, for example in a `.env` file:

   ```
   KONTENT_API_KEY=<management-api-key>
   KONTENT_ENVIRONMENT_ID=<environment-id>
   PORT=3001 # Optional; defaults to 3001
   ```

2. Configure your MCP client with the server's SSE endpoint:

   ```json
   {
     "kontent-ai-sse": {
       "url": "http://localhost:3001/sse"
     }
   }
   ```
Supported operations
The Kontent.ai MCP server exposes a set of tools that represent specific operations your AI assistant or application can perform on your Kontent.ai project. These operations align with your content modeling workflows, enabling AI to query, create, and manage:

- Content types and their elements
- Content type snippets
- Taxonomy groups and terms
- Languages
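When an AI client decides to use one of these operations, it sends the server a `tools/call` message. The request shape below follows the MCP specification; the tool name and arguments are illustrative, since exact tool names depend on the server version:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list-content-types",
    "arguments": {}
  }
}
```

The server executes the operation against your Kontent.ai environment (using the Management API key you configured) and returns the result to the client, which the AI then summarizes or acts on.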
Common use cases
The examples below show how the MCP server tools empower AI assistants to provide meaningful, context-aware interactions, helping teams design, scale, and maintain content models efficiently.

Understand and extend a content model

These actions help AI assistants reason about the structure of the project and suggest or create new types based on user intent.

- “What content types are already defined in this environment?”
- “Add a new ‘Event’ content type with elements for name, date, location, and description.”
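To make the ‘Event’ request above concrete, the sketch below builds the kind of content type payload the Kontent.ai Management API accepts. The helper function is purely illustrative (it is not part of the MCP server), and the element type codenames (`text`, `date_time`, `rich_text`) are assumptions based on the Management API's element types:

```typescript
// Illustrative sketch: the rough shape of a "create content type" payload
// an assistant's request might translate into. Not the server's actual code.

interface ElementSpec {
  name: string;
  type: string; // e.g. "text", "date_time", "rich_text" (assumed codenames)
}

// Derive a codename from a display name, as Kontent.ai codenames are
// lowercase with underscores.
function toCodename(name: string): string {
  return name.toLowerCase().replace(/\s+/g, "_");
}

function buildContentTypePayload(name: string, elements: ElementSpec[]) {
  return {
    name,
    codename: toCodename(name),
    elements: elements.map((e) => ({
      name: e.name,
      codename: toCodename(e.name),
      type: e.type,
    })),
  };
}

// "Add a new 'Event' content type with name, date, location, and description."
const payload = buildContentTypePayload("Event", [
  { name: "Name", type: "text" },
  { name: "Date", type: "date_time" },
  { name: "Location", type: "text" },
  { name: "Description", type: "rich_text" },
]);
console.log(JSON.stringify(payload, null, 2));
```

In practice the MCP server performs this translation for you; the point is that a one-sentence natural-language request maps to a fully structured Management API call.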
Group elements for reuse with snippets
Content type snippets are ideal for maintaining consistency across content types while avoiding duplication. Assistants can define snippets once and insert them wherever needed, ensuring that updates are propagated automatically.

- “Create an ‘SEO metadata’ content type snippet with title, description, and image elements.”
Organize content with taxonomies
Taxonomies provide structure for categorizing and filtering content. With the MCP Server, AI assistants can create and manage taxonomy groups and terms programmatically.

- “Add a taxonomy group called ‘Industries’ with terms for Healthcare, Finance, and Education.”
- “List all taxonomy groups defined in this environment.”
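For the ‘Industries’ request above, the resulting taxonomy group would look roughly like the JSON below. This is a sketch of the Management API's taxonomy group shape (terms can nest arbitrarily), not the exact request the server sends:

```json
{
  "name": "Industries",
  "codename": "industries",
  "terms": [
    { "name": "Healthcare", "codename": "healthcare", "terms": [] },
    { "name": "Finance", "codename": "finance", "terms": [] },
    { "name": "Education", "codename": "education", "terms": [] }
  ]
}
```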
Explore language configuration
Understanding language settings helps the assistant make localization-aware modeling decisions, adapt to localized workflows, and offer content model recommendations based on supported languages.

- “What languages are available in this environment?”