Sentry MCP Server

Official Sentry server for searching issues, events, and errors — includes AI-powered fix recommendations via Sentry's Seer integration.

⚠️ Conditional: Powerful for teams already using Sentry, but the embedded LLM requirement for AI features is an unusual dependency that adds cost and complexity. Great if you're on a Sentry paid plan; skip it if you just want error search.
Language: TypeScript · Transport: stdio, streamable-http · License: Apache-2.0 · Stars: 11,891 · Updated: Feb 2026 · Setup: Moderate

Client Compatibility

  • Claude Desktop
  • Cursor
  • Claude Code
  • VS Code
  • Windsurf
  • Cline

What It Does

Connects Claude to your Sentry error tracking instance. Search issues and events, get detailed error information, create and update issues, and — with the Seer integration — get AI-generated fix recommendations for your bugs. The server works with both Sentry’s hosted platform and self-hosted instances.

The standout feature is Seer, Sentry’s AI analysis engine. When enabled, it provides root cause analysis and suggested fixes for errors. But there’s a significant catch in how it’s wired up (see below).

What It Does Well

  • AI-powered fix recommendations via Seer are genuinely useful. When Seer is enabled, Claude doesn’t just show you the error — it gets Sentry’s analysis of the root cause and suggested code changes. For teams triaging dozens of issues, having the fix suggestion ready in the MCP response saves real time.
  • Remote OAuth server is the cleanest setup path. The hosted version at mcp.sentry.dev/mcp handles auth via OAuth in your browser. No tokens in config files, no environment variables to manage. Just connect and authorize.
  • Self-hosted Sentry support is a real differentiator. The --host flag lets you point the server at any Sentry instance, not just sentry.io. For enterprise teams running self-hosted Sentry, this is the only official MCP option. You can also disable specific skills (like Seer) with --disable-skills=seer.

What It Doesn’t Do Well

  • The AI search features require a separate LLM API key. This is the big gotcha. Sentry’s MCP server itself calls an LLM for AI-powered search and analysis. You need to set EMBEDDED_AGENT_PROVIDER (openai or anthropic) plus the corresponding API key. So Claude is talking to an MCP server that’s talking to another LLM. The cost and complexity of this nested dependency are unusual and not immediately obvious from the README.
  • Token consumption is effectively doubled for AI features. When Claude queries Sentry with AI features enabled, the MCP server makes its own LLM calls behind the scenes. You’re paying for Claude’s tokens AND the embedded agent’s tokens. For teams watching API costs, this adds up quickly with no visibility into the inner LLM’s usage.
  • Issue management tools are basic compared to dedicated project management MCPs. Create and update operations exist, but they’re straightforward CRUD. If you need sophisticated workflow automation (assigning, triaging, bulk operations), you’ll need supplementary tooling.

Setup Notes

Two paths: remote OAuth or local with token. Remote is recommended for Sentry Cloud users — npx -y mcp-remote https://mcp.sentry.dev/mcp handles everything through browser-based OAuth.

For local setup, generate an auth token at sentry.io/settings/auth-tokens. If you want the AI features, you’ll also need an OpenAI or Anthropic API key configured separately. This is the part that surprises people — the MCP server has its own LLM dependency.

Self-hosted users: add --host https://your-sentry.example.com to the args (see the self-hosted config below). Skills can be individually disabled if you don’t want Seer or other AI features.

Config

Remote (recommended for Sentry Cloud):

{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.sentry.dev/mcp"]
    }
  }
}

Local (with AI features):

{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "@sentry/mcp-server"],
      "env": {
        "SENTRY_ACCESS_TOKEN": "sntrys_your_token_here",
        "EMBEDDED_AGENT_PROVIDER": "anthropic",
        "ANTHROPIC_API_KEY": "sk-ant-your_key_here"
      }
    }
  }
}

Local (without AI features):

{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "@sentry/mcp-server", "--disable-skills=seer"],
      "env": {
        "SENTRY_ACCESS_TOKEN": "sntrys_your_token_here"
      }
    }
  }
}
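
Self-hosted (a sketch; the URL is a placeholder for your instance, and the key=value flag form is assumed by analogy with --disable-skills=seer):

{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "@sentry/mcp-server", "--host=https://your-sentry.example.com"],
      "env": {
        "SENTRY_ACCESS_TOKEN": "sntrys_your_token_here"
      }
    }
  }
}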

Tested With

  • Claude Desktop on Windows 11

Environment Variables

  • SENTRY_ACCESS_TOKEN (required) — Auth token from sentry.io/settings/auth-tokens/
  • EMBEDDED_AGENT_PROVIDER — LLM provider for the AI-powered search and analysis features (openai or anthropic) — optional, but required for the AI features
  • OPENAI_API_KEY — Required if EMBEDDED_AGENT_PROVIDER is openai
  • ANTHROPIC_API_KEY — Required if EMBEDDED_AGENT_PROVIDER is anthropic
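
If you choose openai as the provider, the local config’s env block changes accordingly (a sketch mirroring the Anthropic example above; the key value is a placeholder):

{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "@sentry/mcp-server"],
      "env": {
        "SENTRY_ACCESS_TOKEN": "sntrys_your_token_here",
        "EMBEDDED_AGENT_PROVIDER": "openai",
        "OPENAI_API_KEY": "sk-your_key_here"
      }
    }
  }
}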

Prerequisites

  • Node.js 18+
  • Sentry account with auth token
  • Optional: separate LLM API key for AI-powered search features

View on GitHub · npx -y @sentry/mcp-server

Reviewed by J-Dub · February 22, 2026
