
Let AI agents manage your infrastructure.

The Qovery MCP Server exposes Qovery's full API surface to AI coding agents via the Model Context Protocol. Claude, Cursor, OpenCode, and any MCP-compatible tool can deploy, scale, and debug your services by conversation.

Key features

What you get.

01

Standard MCP protocol

Implements the open Model Context Protocol. Any MCP-compatible AI agent can connect -- no custom integration code required.

02

Full API coverage

Every Qovery action is available: deploy, scale, clone environments, manage variables, tail logs, check status. The agent has the same power as the CLI.

03

Works with popular agents

Tested with Claude Code, Cursor, OpenCode, Codex, and Gemini CLI. Install the Qovery MCP server and start deploying by conversation.

04

Secure by design

Runs locally. Uses your existing Qovery API token. No credentials leave your machine. RBAC is enforced server-side.

Use cases

What you can do.

Conversational deploys

"Deploy the backend to staging" -- the agent resolves the service, picks the latest commit, triggers the deployment, and reports back.

AI-assisted debugging

"Why is the payment service failing?" -- the agent checks deployment status, tails logs, inspects environment variables, and suggests fixes.

Environment management

"Clone production to a new QA environment and scale it down" -- the agent handles the full workflow in one conversation.

Getting started

Connect your AI agent

  1. Install the Qovery CLI (includes the MCP server)
  2. Authenticate: qovery auth
  3. Start the MCP server: qovery mcp start
  4. Configure your AI agent to connect to the local MCP endpoint
$ brew install qovery-cli
$ qovery auth
$ qovery mcp start

# In your AI agent config:
# MCP endpoint: stdio://qovery-mcp
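Most MCP-compatible agents are pointed at a local stdio server through a small JSON config file. The snippet below is a sketch using the common `mcpServers` shape shared by clients such as Claude Code and Cursor; the exact file location, and whether your agent launches the server via `qovery mcp start` as shown in the steps above, are assumptions to verify against your agent's documentation.

```json
{
  "mcpServers": {
    "qovery": {
      "command": "qovery",
      "args": ["mcp", "start"]
    }
  }
}
```

With a stdio transport, the agent spawns the command itself and talks to it over stdin/stdout, so no network port needs to be exposed.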

Deploy by conversation.

Connect your AI coding agent to Qovery. Deploy, debug, and manage infrastructure without leaving your editor.