The Kontent.ai MCP Server connects your Kontent.ai projects with AI-powered tools (like Claude, Cursor, or VS Code) using the Model Context Protocol (MCP). It allows AI to understand and work with your project’s content and perform operations via natural language.
The Model Context Protocol (MCP) is an open standard that defines how applications provide context to large language models (LLMs). Think of MCP as a universal connector, allowing AI models to interface with various tools and data sources through a standardized protocol.

With the Kontent.ai MCP server, you can:
Empower AI assistants to read your Kontent.ai content model structure and create new content types, snippets, and taxonomies
Use natural language to query, create, and update content items and their language variants
Move content items through the workflow steps, publish, and unpublish content as needed
Accelerate content model creation by sending diagrams or descriptions that the AI interprets and implements
Integrate with multiple AI clients through a standard JSON configuration format
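Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. As a rough illustration only, a client asking the server to run one of its tools sends a request along these lines (the tool name is a placeholder, not a confirmed tool of the Kontent.ai MCP server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "<tool-name>",
    "arguments": {}
  }
}
```

The server responds with the tool’s result, which the AI assistant then uses to ground its answer.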
For more information on the protocol specification, visit modelcontextprotocol.io.

Before setting up the MCP server, ensure you have:
Node.js installed (the server runs via npx)
A Kontent.ai environment and its environment ID
A Management API key with access to that environment
You can run the Kontent.ai MCP Server using npx, with either the STDIO or the Streamable HTTP (shttp) transport.
STDIO transport: Ideal for direct integration with AI tools that support STDIO communication.
Streamable HTTP transport: Suitable for scenarios requiring a long-lived, streamable HTTP connection. The default port is 3001, but you can configure the port by setting the PORT environment variable.
The server requires the following environment variables (for STDIO and single-tenant HTTP modes):

| Variable | Description | Required |
| --- | --- | --- |
| KONTENT_API_KEY | Your Kontent.ai Management API key | ✅ |
| KONTENT_ENVIRONMENT_ID | Your environment ID | ✅ |
| PORT | Port for HTTP transport (defaults to 3001) | ❌ |
Set these variables in your environment or a .env file. In multi-tenant HTTP mode, no environment variables are required.
To run the server with the STDIO transport, configure your MCP client with the following JSON:
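A minimal sketch, assuming the server is published on npm as @kontent-ai/mcp-server and accepts a stdio argument to select the transport (verify both against the official README):

```json
{
  "mcpServers": {
    "kontent-ai": {
      "command": "npx",
      "args": ["-y", "@kontent-ai/mcp-server@latest", "stdio"],
      "env": {
        "KONTENT_API_KEY": "<your-management-api-key>",
        "KONTENT_ENVIRONMENT_ID": "<your-environment-id>"
      }
    }
  }
}
```

Replace the placeholder values with your Management API key and environment ID; the env block is an alternative to exporting the variables or using a .env file.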
When using the HTTP transport, you can choose between single-tenant mode (simpler setup with environment variables) and multi-tenant mode (one server handling multiple environments).
Single-tenant mode:
1. Set the required environment variables.
2. Start the server.
3. Configure your MCP client with the server’s URL (see the example configuration below).
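For step 3, a sketch of the client configuration, assuming the server listens on the default port 3001 and exposes its Streamable HTTP endpoint at /mcp (confirm the exact path in the official README):

```json
{
  "mcpServers": {
    "kontent-ai-http": {
      "url": "http://localhost:3001/mcp"
    }
  }
}
```

Because the API key and environment ID come from the server’s environment variables in this mode, the client configuration itself stays minimal.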
Multi-tenant mode:
The server can run without environment variables and accept requests for multiple environments using URL path parameters and Bearer authentication. For more details, see the multi-tenant mode documentation on GitHub.
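For illustration only, a multi-tenant client configuration might look roughly like the sketch below. The /<environment-id>/mcp path segment and the headers field are assumptions based on the description above (URL path parameters plus Bearer authentication); confirm the exact URL shape and header handling in the GitHub documentation:

```json
{
  "mcpServers": {
    "kontent-ai-multi-tenant": {
      "url": "http://localhost:3001/<environment-id>/mcp",
      "headers": {
        "Authorization": "Bearer <your-management-api-key>"
      }
    }
  }
}
```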
The Kontent.ai MCP server exposes a set of tools that represent specific operations your AI assistant or application can perform on your Kontent.ai project. These operations align with your content modeling, item, and asset processes, enabling AI to query, create, and manage content types, snippets, taxonomies, content items and their language variants, workflows, and assets.
The examples below show how the MCP server tools empower AI assistants to provide meaningful, context-aware interactions, helping teams design, scale, and maintain content efficiently.
These actions help AI assistants understand the structure of the project and suggest or create new types based on your needs.
“What content types are already defined in this environment?”
“Add a new ‘Event’ content type with elements for name, date, location, and description.”
“Update the ‘Article’ content type to include a taxonomy element for industries.”
“Delete the ‘Press release’ content type.”
Content type snippets are ideal for maintaining consistency across content types while avoiding duplication. Assistants can define snippets once and insert them wherever needed, ensuring that updates are propagated automatically.
“Create an ‘SEO metadata’ content type snippet with title, description, and image elements.”
“List all content type snippets in this environment.”
Taxonomies provide structure for organizing and filtering content. With the MCP Server, AI assistants can create and update taxonomy groups and terms programmatically.
“Add a taxonomy group called ‘Industries’ with terms for Healthcare, Finance, and Education.”
“List all taxonomy groups defined in this environment.”
Draft, update, and delete content using AI. AI assistants can inspect which elements your content items contain and which languages your project supports. They can then generate content based on a content brief or instructions you provide and create the content item together with its language variants.
“Create a blog post called ‘Treatments after heart surgery’ based on this content brief in these languages: English, Spanish, and French.”
“Update the title and summary of the English variant of the item with this ID {ID_of_the_item}.”
“Delete the Spanish variant of the content item ‘Mid-century modern décor at home’.”
“Search all French variants of articles containing the keyword ‘avocado’.”
“Create a new draft version of the English variant of item {ID_of_the_item}.”
Understanding language settings helps the assistant make localization-aware modeling decisions, adapt to localized workflows, and offer content model recommendations based on supported languages.
“What languages are available in this environment?”
“Create missing Spanish variants for all English blog posts.”
Workflows define the lifecycle of content from draft to review to published. AI assistants can automatically move content items between workflow steps, helping teams enforce editorial processes and keep content production on track.
“Move all draft blog posts to the Review step.”
“Schedule the Spanish variant of item {ID} to publish on 2025-09-01 at 10:00 Madrid time.”
“Unpublish the German variant of item {ID} tomorrow at midnight.”
Assets like images and documents enrich and enhance your content. AI assistants can fetch assets or include them in your content items.
“List all images uploaded in the past week.”
“Get the details of asset {assetId}.”
“Insert the image of the forest trail {assetId} into the Autumn Hiking Tips article.”
Filters help you narrow down content items using specific keywords and criteria. You can also search semantically, describing what you’re looking for in natural language. With the MCP Server, AI assistants can help locate items and language variants based on your queries.
“Find unfinished articles that have ‘cardiology’ in the name.”
“Show recent Spanish blog posts that are assigned to Sofia Peletier in ascending order.”
“I’m looking for items about plants that thrive in dry environments.”
SSE transport is supported in the MCP server up to version 0.16.0. Starting from version 0.17.0, SSE transport has been removed and is no longer available.