Kontent.ai MCP server

Martina Farkasova
5 minutes
AI
The Kontent.ai MCP Server connects your Kontent.ai projects with AI-powered tools (like Claude, Cursor, or VS Code) using the Model Context Protocol (MCP). It allows AI to understand and work with your project's content and perform operations via natural language.

Understand the MCP server

The Model Context Protocol (MCP) is an open standard for how applications provide context to large language models (LLMs). Think of MCP as a universal connector that lets AI models interface with various tools and data sources through a single, standardized protocol. With the Kontent.ai MCP server, you can:
  • Empower AI assistants to read your Kontent.ai content model structure and create new content types, snippets, and taxonomies
  • Use natural language to query, create, and update content items and their language variants
  • Move content items through the workflow steps, publish, and unpublish content as needed
  • Accelerate content model creation by sending diagrams or descriptions that the AI interprets and implements
  • Integrate with multiple AI clients through a standard JSON configuration format
For more information on the protocol specification, visit modelcontextprotocol.io.

How to run the MCP server

Prerequisites

Before setting up the MCP server, ensure you have:
  • The latest version of Node.js installed
  • A Kontent.ai Management API key with the Manage content model permission
  • Your Kontent.ai environment ID
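A quick way to confirm the Node.js tooling is in place is to check the versions available on your machine (any current Node.js release that ships npx should work):
  • Shell
node --version   # prints the installed Node.js version
npx --version    # npx ships with npm and is used to run the MCP server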

Setup options

You can run the Kontent.ai MCP Server using npx, choosing between the STDIO and Streamable HTTP (shttp) transports.
STDIO transport: Ideal for direct integration with AI tools that support STDIO communication.
  • Shell
npx @kontent-ai/mcp-server@latest stdio
Streamable HTTP transport: Suitable for scenarios that require a long-lived, streamable HTTP connection. The default port is 3001; you can change it by setting the PORT environment variable.
  • Shell
npx @kontent-ai/mcp-server@latest shttp
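For example, to run the Streamable HTTP transport on a different port, set the PORT environment variable when starting the server (4000 below is just an illustrative value):
  • Shell
PORT=4000 npx @kontent-ai/mcp-server@latest shttp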

Configuration

The server requires the following environment variables (for STDIO and single-tenant HTTP modes):
Variable                 Description                                   Required
KONTENT_API_KEY          Your Kontent.ai Management API key            ✅
KONTENT_ENVIRONMENT_ID   Your environment ID                           ✅
PORT                     Port for HTTP transport (defaults to 3001)    ❌
Set these variables in your environment or a .env file. In multi-tenant HTTP mode, no environment variables are required.
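For example, a minimal .env file for single-tenant use could look like the following (placeholder values shown):
  • Shell
KONTENT_API_KEY=<your-management-api-key>
KONTENT_ENVIRONMENT_ID=<your-environment-id>
# PORT is optional and only applies to the HTTP transport
PORT=3001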

Transport configuration

STDIO transport

To run the server with the STDIO transport, configure your MCP client with the following JSON:
  • JSON
{
  "kontent-ai-stdio": {
    "command": "npx",
    "args": ["@kontent-ai/mcp-server@latest", "stdio"],
    "env": {
      "KONTENT_API_KEY": "<your-management-api-key>",
      "KONTENT_ENVIRONMENT_ID": "<your-environment-id>"
    }
  }
}
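
Before wiring the server into a client, you can optionally check that the command starts on your machine by running it with the variables set inline (placeholder values shown). A STDIO server waits for MCP messages on standard input, so stop it with Ctrl+C once it starts without errors:
  • Shell
KONTENT_API_KEY=<your-management-api-key> KONTENT_ENVIRONMENT_ID=<your-environment-id> npx @kontent-ai/mcp-server@latest stdio
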
Streamable HTTP transport (shttp)

When using the HTTP transport, you can choose between single-tenant mode (simpler setup with environment variables) and multi-tenant mode (one server handling multiple environments).
Single-tenant mode:
1. Start the server.
  • Shell
npx @kontent-ai/mcp-server@latest shttp
2. Set environment variables.
  • Shell
KONTENT_API_KEY=<management-api-key>
KONTENT_ENVIRONMENT_ID=<environment-id>
PORT=3001  # Optional; defaults to 3001
3. Configure your MCP client.
  • JSON
{
  "kontent-ai-http": {
    "url": "http://localhost:3001/mcp"
  }
}
Multi-tenant mode: The server can run without environment variables and accept requests for multiple environments using URL path parameters and Bearer authentication. For more details, see the multi-tenant mode documentation on GitHub.

Note: SSE transport is supported in the MCP server up to version 0.16.0. Starting from version 0.17.0, SSE transport has been removed and is no longer available.

Supported operations

The Kontent.ai MCP server exposes a set of tools that represent specific operations your AI assistant or application can perform on your Kontent.ai project. These operations cover content modeling, content items, and asset management, enabling AI to query, create, and manage:
  • Content types and their elements
  • Content type snippets
  • Taxonomy groups and terms
  • Content items
  • Language variants
  • Assets
  • Languages
  • Workflows
You can find the full set of available operations in the MCP Server GitHub repository.
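If you want to browse the exposed tools interactively before connecting an AI assistant, one option is the general-purpose MCP Inspector. The sketch below assumes the Inspector is available via npx and accepts -e flags for passing environment variables; check its documentation if your version differs:
  • Shell
npx @modelcontextprotocol/inspector \
  -e KONTENT_API_KEY=<your-management-api-key> \
  -e KONTENT_ENVIRONMENT_ID=<your-environment-id> \
  npx @kontent-ai/mcp-server@latest stdio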

Common use cases

The examples below show how the MCP server tools empower AI assistants to provide meaningful, context-aware interactions, helping teams design, scale, and maintain content efficiently.

Understand and extend a content model

These actions help AI assistants reason about the structure of the project and suggest or create new types based on user intent.
  • “What content types are already defined in this environment?”
  • “Add a new ‘Event’ content type with elements for name, date, location, and description.”

Group elements for reuse with snippets

Content type snippets are ideal for maintaining consistency across content types while avoiding duplication. Assistants can define snippets once and insert them wherever needed, ensuring that updates are propagated automatically.
  • “Create an ‘SEO metadata’ content type snippet with title, description, and image elements.”

Organize content with taxonomies

Taxonomies provide structure for categorizing and filtering content. With the MCP Server, AI assistants can create and manage taxonomy groups and terms programmatically.
  • “Add a taxonomy group called ‘Industries’ with terms for Healthcare, Finance, and Education.”
  • “List all taxonomy groups defined in this environment.”

Manage content items

Draft, update, and delete content using AI. AI assistants can inspect which elements your items contain and which languages your project uses. They can then generate content based on a content brief or instructions you provide and create the content item with its language variants.
  • “Create a blog post called ‘Treatments after heart surgery’ based on this content brief in these languages: English, Spanish, and French.”
  • “Update the title and summary of the English variant of the item with this ID {ID_of_the_item}.”
  • “Delete the Spanish variant of the content item ‘Mid-century modern décor at home’.”

Explore language configuration

Understanding language settings helps the assistant make localization-aware modeling decisions, adapt to localized workflows, and offer content model recommendations based on supported languages.
  • “What languages are available in this environment?”
