Chat with Your Zabbix: A Practical Guide to Integrating AI with Zabbix AI MCP Server
Good morning everyone, Dimitri Bellini here, back on my channel Quadrata! Today, we’re diving into something truly interesting, a bit experimental, and as always, involving our good friend Zabbix. This exploration comes thanks to a member of the Italian Zabbix community, Matteo Peirone, who reached out on LinkedIn to share a fascinating project he developed. I was immediately intrigued and knew I had to show it to you.
So, what are we talking about? It’s called the Zabbix AI MCP Server, and it allows us to instrument operations within Zabbix using artificial intelligence. Let’s break down what this means and how it works.
What is the Zabbix AI MCP Server?
At its core, the Zabbix AI MCP Server acts as an intermediary, bridging the gap between artificial intelligence and the Zabbix server’s APIs. Many of you might already be familiar with Zabbix APIs, which allow us to consult data or perform actions within our Zabbix environment. This project aims to simplify these interactions significantly, especially for those not deeply versed in API scripting.
To get started, we need a few key components:
- An inference engine: This can be cloud-based or local (via Ollama or vLLM). I’ve been experimenting with a few.
- An adequate AI model compatible with the engine.
- The Zabbix AI MCP Server itself.
- A small, yet crucial, project called mcp-to-openapi-proxy.
- In my setup, I’m using Open Web UI as a chat interface, similar to ChatGPT, to interact with the AI.
Understanding MCP: Model Context Protocol
Before we go further, it’s important to understand what “MCP” stands for. It means Model Context Protocol. This protocol, invented by Anthropic (the creators of Claude), is designed to allow AI models to interact with external “tools.” These tools can be anything from platform functionalities to specific software features.
Essentially, MCP provides a standardized way for an AI to:
- Discover available tools and their capabilities (e.g., functions, resources).
- Understand how to use these tools, including descriptions and invocation methods.
This is particularly relevant for AI agents, which are sophisticated prompts instructed to perform tasks that might require external interactions, like research or system operations. MCP helps standardize these tool interactions, which can be a challenge as not all LLM models handle function calls equally well.
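To make the discovery step concrete, here is a minimal sketch of what an MCP-style tool descriptor might look like. The field names follow the general shape of an MCP `tools/list` response; the tool name `problem_get` and its parameters are illustrative assumptions, not taken verbatim from the project:

```python
# Hypothetical sketch of an MCP-style tool descriptor, similar in shape to
# what a server advertises during tool discovery. The tool name and
# parameters are illustrative, not copied from the Zabbix AI MCP Server.
tool_descriptor = {
    "name": "problem_get",
    "description": "Retrieve current problems from Zabbix (wraps problem.get).",
    "inputSchema": {
        "type": "object",
        "properties": {
            "limit": {"type": "integer", "description": "Max problems to return"},
            "severity": {"type": "integer", "description": "Minimum severity (0-5)"},
        },
        "required": [],
    },
}

def describe_tool(tool: dict) -> str:
    """Render a one-line summary an LLM could use when picking a tool."""
    params = ", ".join(tool["inputSchema"]["properties"])
    return f"{tool['name']}({params}): {tool['description']}"

print(describe_tool(tool_descriptor))
```

A summary line like this is essentially what the model reasons over when deciding which tool to invoke for a given request.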
How the Zabbix AI MCP Server Works
The Zabbix AI MCP Server, developed by Matteo Peirone, leverages this MCP framework. It exposes Zabbix’s API functionalities as “tools” that an AI can understand and use. This means you can:
- Consult data: Ask for the latest problems, analyze triggers, or get details about a host.
- Perform actions: Create or update objects within Zabbix (if not in read-only mode).
All of this can be done without needing to write complex API scripts yourself!
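For comparison, this is roughly the raw JSON-RPC request the MCP server issues on your behalf when you ask for "the latest five problems". The `problem.get` method and its parameters are standard Zabbix API; the token value is a placeholder:

```python
import json

# Build the Zabbix JSON-RPC body for problem.get — the kind of call the
# MCP server makes behind the scenes, sparing you from writing it yourself.
# The token here is a placeholder, not a real credential.
def build_problem_get(token: str, limit: int = 5) -> str:
    payload = {
        "jsonrpc": "2.0",
        "method": "problem.get",
        "params": {
            "output": "extend",
            "sortfield": ["eventid"],
            "sortorder": "DESC",
            "limit": limit,
        },
        "auth": token,
        "id": 1,
    }
    return json.dumps(payload)

print(build_problem_get("your_zabbix_api_token"))
```

With the MCP server in place, the natural-language request replaces all of this boilerplate.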
The Architecture in My Setup:
Here’s how the pieces connect in my demonstration:
- Open Web UI: This is my chat interface where I type my requests in natural language.
- mcp-to-openapi-proxy: This acts as a bridge. Open Web UI is instructed to look for tools here. This proxy converts MCP functions into REST API endpoints (normalized in the OpenAPI standard) that Open Web UI can consume. It essentially acts as an MCP client.
- Zabbix AI MCP Server: This is the star of the show. The mcp-to-openapi-proxy communicates with this server. The Zabbix AI MCP Server is configured with the details of my Zabbix instance (URL, authentication token or credentials).
- Zabbix Server: The Zabbix AI MCP Server then interacts with the actual Zabbix server APIs to fetch data or perform actions based on the AI’s request.
Getting Started: Installation and Setup Guide
Here’s a brief rundown of how I got this up and running. It might seem a bit involved, but following these steps should make it manageable:
- Clone the Zabbix AI MCP Server repository (you’ll find it on Matteo Peirone’s GitHub):

  ```shell
  git clone https://github.com/mpeirone/zabbix-mcp-server.git
  ```

- Navigate into the directory and install the dependencies (I’m using `uv` here, a fast Python package installer and resolver):

  ```shell
  cd zabbix-mcp-server
  uv sync
  ```

- Configure the Zabbix AI MCP Server. Copy the example configuration file:

  ```shell
  cp config/.env.example .env
  ```

  Then edit the `.env` file to include your Zabbix server URL and authentication method (token or user/password), and set `READ_ONLY=false` if you want to test creation/update functionality (use with caution!):

  ```shell
  ZABBIX_URL="http://your-zabbix-server/api_jsonrpc.php"
  ZABBIX_TOKEN="your_zabbix_api_token"
  # or
  # ZABBIX_USER="your_user"
  # ZABBIX_PASSWORD="your_password"
  READ_ONLY=false
  ```

- Install and run the mcp-to-openapi-proxy. This component exposes the MCP server over HTTP:

  ```shell
  pipx install uv
  uvx mcpo --port 8000 --api-key "top-secret" -- uv run python3.11 src/zabbix_mcp_server.py
  ```

  This starts the proxy on port 8000, which in turn launches the Zabbix MCP server application. The API key you pass here (e.g., “top-secret”) is the one you’ll later provide to Open Web UI.
- Set up Open Web UI. Deploy Open Web UI (e.g., via Docker). I’ve configured mine to connect to a local Ollama instance, but you can also point it to other LLM providers.
- Configure tools in Open Web UI:
  - In Open Web UI, navigate to Admin Panel -> Settings -> Connections to set up your LLM connection (e.g., Ollama, OpenAI, OpenRouter).
  - Then go to Tools and add a new tool server:
    - URL: point it to where your `mcp-to-openapi-proxy` is running (e.g., `http://my_server_ip:8000/`).
    - Authentication: use “Bearer Token” and provide the API key you set when starting `mcp-to-openapi-proxy` (e.g., “top-secret”).
    - Give it a name (e.g., “Zabbix MCP Proxy”) and make sure it’s enabled.
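Before wiring up Open Web UI, you can sanity-check that the proxy is answering. mcpo serves an OpenAPI schema at `/openapi.json`; the host, port, and API key below match the example commands above and are assumptions for your particular setup:

```python
import urllib.request

# Build an authenticated request for the proxy's OpenAPI schema. The base
# URL and API key are assumptions matching the example setup in this guide.
def openapi_request(base_url: str, api_key: str) -> urllib.request.Request:
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/openapi.json",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = openapi_request("http://my_server_ip:8000", "top-secret")
# Uncomment once the proxy is actually running:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)  # a 200 means Open Web UI will be able to discover the tools
```

If the schema loads, Open Web UI should be able to discover the same endpoints through the tool server you configured.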
Putting It to the Test: Demo Highlights
In my video, I demonstrated a few queries:
- “Give me the latest five Zabbix problems in a nice table.” Using a local Mistral model via vLLM, the system successfully called the Zabbix MCP Server, retrieved the problem data, and formatted it into a table. The accuracy was good, matching the problems shown in my Zabbix dashboard.
- Fetching host details: I asked, “Give me the details of the host called Zabbix server.” Initially, with the local model, the phrasing needed to be precise. Switching to a more powerful model like Gemini Pro (via OpenRouter) and specifying “hostname equal to Zabbix server” yielded the correct host ID and details. This highlights how much the LLM’s capability matters in interpreting requests and using the tools.
One challenge observed is that sometimes, for more complex information (like correlating event IDs with host names that aren’t directly in the initial `problem.get` API call), the AI might need more sophisticated tool definitions or better prompting to make multiple, related API calls. However, the beauty of MCP is that you could create a custom “tool” within the Zabbix MCP Server that performs these multiple queries internally and presents a consolidated result.
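A minimal sketch of that consolidated-tool idea: one function that chains `problem.get` and `event.get` (with `selectHosts`) internally, so the LLM receives problem names and hostnames in a single round trip. The `api` callable stands in for the MCP server’s Zabbix client, and `fake_api` is illustrative test data, not real Zabbix output:

```python
# Hypothetical consolidated tool: chain two Zabbix API calls internally so
# the AI doesn't have to orchestrate them itself. `api` stands in for the
# server's Zabbix client; fake_api below is illustrative data only.
def problems_with_hosts(api, limit=5):
    problems = api("problem.get", {"limit": limit, "sortfield": ["eventid"],
                                   "sortorder": "DESC"})
    eventids = [p["eventid"] for p in problems]
    # Resolve the hosts behind each problem's event in a second call.
    events = api("event.get", {"eventids": eventids, "selectHosts": ["name"]})
    host_by_event = {e["eventid"]: e["hosts"][0]["name"]
                     for e in events if e.get("hosts")}
    return [{"problem": p["name"],
             "host": host_by_event.get(p["eventid"], "unknown")}
            for p in problems]

def fake_api(method, params):
    if method == "problem.get":
        return [{"eventid": "101", "name": "High CPU usage"}]
    return [{"eventid": "101", "hosts": [{"name": "web01"}]}]

print(problems_with_hosts(fake_api, limit=1))
# → [{'problem': 'High CPU usage', 'host': 'web01'}]
```

Exposed as a single MCP tool, this would let even a modest local model answer “which hosts have problems?” in one shot, instead of relying on the LLM to plan two dependent API calls.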
The Potential and Why This is Exciting
This approach is incredibly versatile. For those not comfortable with APIs, it’s a game-changer. But even for seasoned users, it opens up possibilities for quick queries and potentially for building more complex AI-driven automation around Zabbix.
The Zabbix AI MCP Server is an experiment, something new and fresh. It’s a fantastic starting point that can be refined and improved, perhaps with your help and ideas! The fact that it’s built on an open standard like MCP means it could integrate with a growing ecosystem of AI tools and agents.
Join the Conversation!
This is just the beginning. It’s fascinating to think about how we can use methodologies like the MCP server not just within Zabbix, but across many other applications. The ability to automate and interact with complex systems using natural language is a powerful concept.
What do you think about this? Can you see yourself using something like this? What other use cases come to mind? Let me know in the comments below – I’m really keen to hear your thoughts and start a discussion on this topic.
If you found this interesting, please give the video a big thumbs up, and if you haven’t already, subscribe to Quadrata for more explorations into the world of open source and IT!
That’s all for today. See you next week!
A big thank you again to Matteo Peirone for this innovative project!
Connect with me and the community:
- My YouTube Channel: Quadrata (https://www.youtube.com/@quadrata)
- ZabbixItalia Telegram Channel: (https://t.me/zabbixitalia) – Feel free to reach out to Matteo or discuss this project there!
