Best MCP Servers May 2026: Top 12 Updated for Claude & Cursor
The MCP ecosystem passed 2,000 public servers in early 2026, and the noise is real. Most are demos. The servers that actually pay rent in production AI workflows form a much shorter list. Here's the May 2026 working set, tested across Claude Desktop, Cursor 3, and Windsurf, with what each one is genuinely good at.
Last verified: May 1, 2026
Quick decision
If you’re starting today, install five servers and stop:
- Filesystem (the built-in workhorse).
- GitHub MCP (PRs, issues, search).
- Postgres or SQLite (whichever your stack uses).
- Obsidian or Notion (whichever you write in).
- Memory (persistent context across sessions — Mem0 or Letta).
That’s 90% of the daily-driver value. Add specialized servers when you have a specific repeated workflow.
The full top 12
1. Filesystem MCP — the universal workhorse
The official @modelcontextprotocol/server-filesystem is the most-used MCP server in the ecosystem. Read, write, list, search across whitelisted directories.
- Best for: Any task involving local files. Code, docs, notes, configs.
- Watch out for: Set the allowlist tightly. Don’t expose your home directory.
- Install: Comes with Claude Desktop quickstart configs; one-click in Cursor 3.
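A minimal Claude Desktop config for the filesystem server looks like the sketch below. The paths are placeholders — point them at specific project folders, not your home directory, since in the reference implementation every directory passed as an argument becomes part of the allowlist:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/projects/my-app",
        "/Users/you/notes"
      ]
    }
  }
}
```

The server refuses reads and writes outside the listed directories, which is why a tight allowlist here is the single cheapest security win in the whole setup.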
2. GitHub MCP — PR review and code search
The official GitHub MCP server lets your AI client read PRs, post review comments, search code across repos, manage issues, and trigger workflows. By May 2026 it is the de facto bridge between Claude/Cursor and GitHub.
- Best for: PR review at scale, cross-repo code search, issue triage.
- Watch out for: Token scopes. Use a fine-grained PAT with only the repos you need.
- Install:
npx @modelcontextprotocol/server-github, or one-click in Cursor 3.
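A config sketch for the npx route, assuming the reference server's convention of passing the token via the GITHUB_PERSONAL_ACCESS_TOKEN environment variable (the token value is a placeholder):

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "github_pat_..."
      }
    }
  }
}
```

Generate this as a fine-grained PAT restricted to the specific repositories the agent needs, per the scope warning above, rather than a classic token with org-wide access.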
3. Obsidian MCP — vision-aware vault access
The community Obsidian MCP server (the strongest as of May 2026) goes beyond file CRUD. It traverses link graphs to find orphaned notes, edits specific sections by heading without overwriting siblings, and the latest version runs images in the vault through the model's vision capabilities.
- Best for: Personal knowledge management, research synthesis, structured note editing.
- Watch out for: Backup the vault before letting an agent do bulk edits.
- Install: Community plugin + MCP wrapper; setup ~5 minutes.
4. Postgres MCP — query your prod database safely
The Postgres MCP servers (multiple competing implementations exist; the official Anthropic-hosted one is solid) give read and write database access.
- Best for: Schema introspection, ad-hoc queries, data analysis from the IDE.
- Watch out for: Use a read-only role for routine work. Write access for an AI agent needs careful review.
- Install:
@modelcontextprotocol/server-postgres, or a vendor MCP from Supabase / Neon.
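A config sketch for the reference server, which takes the connection string as its final argument. The credentials below are placeholders; following the read-only advice above, connect as a role that has been granted only SELECT on the schemas the agent needs:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://readonly_user:password@localhost:5432/mydb"
      ]
    }
  }
}
```

Enforcing read-only at the database-role level is safer than trusting the MCP layer: even a prompt-injected agent can't issue a write the role doesn't have.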
5. Linear MCP — issue tracking that works
Linear’s official MCP server (April 2026) handles issue creation, updates, sprint planning, and search.
- Best for: Engineering teams that already live in Linear.
- Watch out for: Permissions inherit your Linear user; an agent acting as you can close real tickets.
6. Slack MCP — channel-aware messaging
The official Slack MCP exposes message search, channel reading, and posting. Useful for summarizing channels or responding from an AI client.
- Best for: Engineering managers, support engineers, anyone whose comms live in Slack.
- Watch out for: Posting permissions. Most teams scope MCP to read-only channels first.
7. Notion MCP — workspace as memory
Notion’s MCP is solid in May 2026. Read pages, query databases, create entries.
- Best for: Teams that use Notion as a doc/wiki hub.
- Watch out for: Notion API rate limits on large workspaces.
8. Atlassian Jira MCP — still dominant
Despite developer skepticism (the recent r/mcp top-50 thread joked that “Jira in the top in 2026 tells you everything you need to know about MCP — it’s dead”), Jira MCP remains heavily installed because enterprise teams still live in Jira. The official Atlassian server (April 2026) is solid.
- Best for: Enterprise teams. PR-to-ticket linking, sprint reports.
- Watch out for: Atlassian API permissions are gnarly. Test in a sandbox project first.
9. Cloudflare MCP — production infrastructure
Cloudflare’s official MCP exposes Workers, R2, D1, and DNS management. Production-grade.
- Best for: Teams running on Cloudflare. Zero-trust DNS changes, Worker deploys, R2 inspection.
- Watch out for: Token scopes. Cloudflare’s API tokens are powerful.
10. Stripe MCP — payments and subscriptions
The official Stripe MCP (mature in 2026) lets the AI inspect customers, subscriptions, payments, and dispute information.
- Best for: Customer support, billing investigation, subscription analytics.
- Watch out for: Use restricted-key scopes. Never give an AI agent secret-key access.
11. Mem0 / Letta — persistent memory
Memory MCP servers (Mem0, Letta, and the OpenMemory project) give AI assistants persistent memory across sessions. By May 2026 these are core for any serious AI workflow.
- Best for: Anyone who restarts Claude Desktop or Cursor more than once a day.
- Watch out for: Privacy. Memory backends store conversation context; choose self-hosted (Letta) or trusted hosted (Mem0 cloud) based on sensitivity.
12. Sentry MCP — production error context
Sentry’s MCP exposes issues, stack traces, and event details. Combining this with GitHub MCP lets your AI assistant read a production error and propose a PR.
- Best for: On-call engineers, production debugging.
- Watch out for: Sensitive data in stack traces and breadcrumbs.
Honorable mentions
- Docker Hub MCP — useful if you live in containers (good for image search, tag management).
- Brave Search MCP — fast web search inside the IDE.
- Vercel MCP — deploys, env vars, log inspection.
- Supabase MCP — combined database + auth + storage if you’re on Supabase.
- Browserbase / Playwright MCP — programmable browser, useful for QA agents.
- OpenClaw MCP servers — local-first MCP setup for users who want everything self-hosted.
How the ecosystem changed in April 2026
Three shifts that matter:
- Vendor-official servers replaced community ones in most major categories. GitHub, Cloudflare, Stripe, Atlassian, Linear, Slack all shipped official MCPs through Q1–Q2 2026, and the official servers are now the default in most installs.
- Cursor 3 added one-click MCP install (April 2026), which collapsed the friction of trying new servers. This drove a noticeable spike in install counts across the long tail.
- Memory servers became table stakes. The Mem0/Letta/OpenMemory wave through Q1 2026 turned persistent memory from an exotic feature into a default expectation for serious AI workflows.
Watch-outs
- MCP server security incidents. A few npm-distributed servers have shipped supply-chain compromises in 2025–2026. Pin versions, review release notes, prefer vendor-official.
- Permission scoping. MCPs run with the permissions of the user running the AI client. Tighten allowlists, scope tokens, audit what an agent could do if instructed adversarially.
- Rate limits. Many vendor MCPs hit rate-limit walls on heavy use. For high-throughput agents, plan caching or batching.
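The version-pinning advice translates directly into the client config: instead of letting npx resolve to latest on every launch, pin an exact package version in the args (the version number below is illustrative, not a real release — check the package's own release notes):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem@1.2.3",
        "/Users/you/projects/my-app"
      ]
    }
  }
}
```

A pinned version means a compromised release can't reach you automatically; the trade-off is that you upgrade deliberately, after reviewing the changelog.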
Bottom line
The May 2026 MCP ecosystem is mature for the top tier and noisy on the long tail. Install Filesystem, GitHub, Postgres, Notion or Obsidian, and a memory server, and you have 90% of the value. Add Cloudflare/Stripe/Linear/Slack/Sentry as your stack demands. Skip the long tail unless you have a specific repeated workflow that justifies the install.