
MCP servers are changing the game for AI tools

By Anonymous, Buildcamp Founder · April 20, 2025

If you’ve heard of MCP servers and wondered what the hype is about—you’re in the right place.

Model Context Protocol (MCP) servers are changing the game for AI tools. They let AI assistants actually do things—not just talk. From querying your Supabase database to making payments with Stripe or managing pull requests on GitHub, MCP servers connect AI models to real-world services.

This beginner-friendly guide explains what MCP servers are, how they work, and how to get started with them inside AI-powered editors like Cursor and Windsurf.


In Plain English: What’s an MCP Server?

MCP stands for Model Context Protocol. It’s a standard that lets AI assistants (like those in Cursor or Windsurf) plug into external tools and services.

Without MCP, an AI might give you code or advice—but it can’t take action.

With MCP, it can:

  • Talk to your database
  • Create GitHub issues or review pull requests
  • Interact with Stripe to handle payments
  • And a whole lot more

Think of MCP servers as the missing link between text-generation AI and actual software automation.


How MCP Servers Work (The Simple Version)

There are three parts to the system:

  1. MCP Host: This is your AI assistant (like in Cursor or Windsurf)
  2. MCP Client: The thing inside the host that makes requests to tools
  3. MCP Server: External tools or APIs you want the AI to use
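The conversation between these parts is just JSON-RPC 2.0 messages sent over a transport. A minimal Python sketch of a tool-call round trip (the create_issue tool and its arguments are hypothetical; the message shape follows the MCP spec's tools/call method):

```python
import json

# A hypothetical MCP client request asking a server to run a tool.
# MCP messages are JSON-RPC 2.0; "tools/call" is the method a client
# uses to invoke a tool that a server exposes.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_issue",  # hypothetical GitHub-style tool
        "arguments": {"title": "Fix login bug", "repo": "acme/app"},
    },
}

# Serialized and sent over the transport (stdio or HTTP) as one message.
wire = json.dumps(request)

# The server replies with the same id and a result payload.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Issue created"}]},
}

print(json.loads(wire)["method"])  # tools/call
```

The host never needs to know how the server does its work; it only speaks this message format.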

MCP servers expose:

  • Resources (like REST GET endpoints) for reading data
  • Tools (like REST POST, PUT, or DELETE endpoints) for performing actions
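The split can be sketched with a toy in-memory server (the names below are illustrative, not the real MCP SDK): resources are read-only lookups, tools are actions with side effects.

```python
# Toy illustration of the resource/tool split, not the real MCP SDK.
users = {"1": {"name": "Ada", "email": "ada@example.com"}}

def read_resource(uri: str) -> dict:
    """Resources are read-only, like a REST GET: no side effects."""
    _, user_id = uri.rsplit("/", 1)
    return users[user_id]

def call_tool(name: str, arguments: dict) -> dict:
    """Tools perform actions, like POST/PUT/DELETE: they may change state."""
    if name == "update_email":
        users[arguments["id"]]["email"] = arguments["email"]
        return {"ok": True}
    raise ValueError(f"unknown tool: {name}")

print(read_resource("users://1")["name"])  # Ada
call_tool("update_email", {"id": "1", "email": "ada@newmail.com"})
print(users["1"]["email"])  # ada@newmail.com
```

Keeping reads and writes separate is what lets a host treat resources as safe to browse while gating tools behind confirmation.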

Why Use MCP? Here’s What You Get:

  • Standardized format: one tool works across many hosts
  • Modularity: plug in new tools without changing your AI
  • Interoperability: one server works with many LLMs or agents
  • Security: sandboxed tools, safer actions

And most importantly: AI assistants become way more useful.


MCP vs. Traditional APIs

Traditional APIs:

  • Are often monolithic
  • Can be hard to scale
  • Mix read/write logic in complex ways

MCP:

  • Uses microservices
  • Separates read-only resources from tools
  • Lets you scale and manage services independently
  • Makes fault isolation and security easier

Comparison   | Traditional API      | MCP
Architecture | Monolithic           | Microservices
Flexibility  | Less flexible        | Plug-and-play modular
Scaling      | Scale entire system  | Scale individual services
Tool Access  | Usually limited      | Purpose-built and exposed


Popular MCP Servers

🧠 GitHub MCP

  • Read repos, issues, pull requests
  • Comment, commit, and review code
  • Use AI to manage dev workflows

Prompt example:

"Review the latest PR and flag anything unusual."

🧩 Supabase MCP

  • Manage your database via natural language
  • Read/write records, modify schemas, run SQL

Prompt example:

"Show me all users with an @gmail.com email."

💳 Stripe MCP

  • Create payments, subscriptions, invoices
  • Manage customers and plans

Prompt example:

"Create a new $9.99/month plan called 'Pro Plan'."


How to Use MCP Servers in Cursor or Windsurf

🔧 In Cursor

  1. Open Command Palette → Search for "MCP settings"
  2. Enable MCP
  3. Add MCP servers manually or via Composio (managed service)

🌊 In Windsurf

  1. Go to Settings → MCP
  2. Enable MCP (on by default for pro users)
  3. Add servers (manually or from catalog)

Both editors support two transport types:

  • stdio (for MCP servers running locally)
  • SSE (Server-Sent Events, for remote servers over HTTPS)
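Registering a server is usually a small JSON entry. A sketch of what a Cursor-style mcp.json might contain (the server names, package, and URL below are placeholders; check each server's docs for the real command or endpoint):

```json
{
  "mcpServers": {
    "my-local-server": {
      "command": "npx",
      "args": ["-y", "@example/my-mcp-server"]
    },
    "my-remote-server": {
      "url": "https://example.com/sse"
    }
  }
}
```

A "command" entry launches a local server over stdio; a "url" entry points at a remote SSE endpoint.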

Once added, ask the AI something like:

"Update the Supabase schema to add a last_login timestamp column."


Real Examples

✅ GitHub MCP

"Review PR #456 and leave comments for unclear variable names."

  • AI authenticates with GitHub
  • Loads the PR
  • Leaves inline suggestions

✅ Supabase MCP

"Find users who haven't logged in since Jan 1."

  • AI creates a SQL query
  • Runs it
  • Returns the data in chat
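Behind a prompt like this, the server just runs SQL. A self-contained sketch using Python's built-in sqlite3 as a stand-in for the real Postgres database (the users table and its columns are assumed for illustration):

```python
import sqlite3

# Stand-in for the Supabase (Postgres) database the MCP server would query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, last_login TEXT)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?)",
    [
        ("ada@example.com", "2024-12-30"),
        ("bob@example.com", "2025-02-15"),
    ],
)

# Roughly the SQL an assistant might emit for
# "Find users who haven't logged in since Jan 1."
rows = conn.execute(
    "SELECT email FROM users WHERE last_login < '2025-01-01'"
).fetchall()

print(rows)  # [('ada@example.com',)]
```

The value of the MCP layer is that you describe the query in plain English and review the generated SQL, rather than opening a database client.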

✅ Stripe MCP

"Show me all failed payments from the last 30 days."

  • AI queries failed charges
  • Returns a formatted summary

Bonus: Cross-Service Workflows

This is where it gets cool. You can chain actions:

  • Create a GitHub issue
  • Link it to a feature branch
  • Update your Supabase schema
  • Push to Vercel

All from one prompt. ✨


Best Practices

  • ✅ Use limited-scope tokens
  • ✅ Rotate keys regularly
  • ✅ Avoid using MCP for sensitive data edits (review before applying)
  • ✅ Run local servers for high-trust environments

When to Use MCP

✅ Managing your dev workflow via chat
✅ Writing + running SQL without opening a DB client
✅ Creating and managing SaaS billing flows
✅ Automating boring ops tasks (without building full dashboards)

Skip it when:

  • You’re making irreversible changes
  • You need granular approval flows
  • The AI isn’t confident in the request

Final Thoughts

MCP servers give AI real power: not just to generate, but to execute.

It’s a huge step forward for devs, founders, and power users who want to work faster—and smarter.

Try it in Cursor, Windsurf, or build your own MCP server for any tool or service you love.

You’re not just talking to AI anymore. You’re working with it.

