What Are MCP Servers and Why Are People Crazy About Them?

Everyone is going gaga over using MCP servers in their AI workflow. But what exactly are they, and why do they matter?

It took me way longer than I’d like to admit to wrap my head around MCP servers.

At first glance, they sound like just another protocol in the never-ending parade of tech buzzwords that comes with AI.

But trust me, once you understand what they are, you start to see why people are obsessed with them.

This post isn’t meant to be the ultimate deep dive (I’ll link to some great resources for that at the end). Instead, consider it just a lil introduction or a starter on MCP servers.

And no, I’m not going to explain MCP using USB-C as a metaphor. If you get that joke, congrats, you’ve clearly been Googling around like the rest of us. If not… well, give it time. 😛

MCP architecture explained with USB Type-C as an example | Source: Norah Sakal's Blog

What even is an MCP Server?

MCP stands for Model Context Protocol, an open standard introduced by Anthropic in November 2024.

Its purpose is to improve how AI models interact with external systems, not by modifying the models themselves, but by providing them structured, secure access to real-world data, tools, and services.

An MCP server is a standalone service that exposes specific capabilities, such as reading files, querying databases, invoking APIs, or offering reusable prompts, in a standardized format that AI models can understand.

Rather than building custom integrations for every individual data source or tool, developers can implement MCP servers that conform to a shared protocol.

This eliminates the need for repetitive boilerplate and reduces complexity in AI applications.
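
To make that concrete, here is a minimal sketch of what such a server might look like with the official MCP Python SDK (the same SDK we'll use in the upcoming tutorial). The server name and the example tool are made up for illustration; the point is that the SDK handles the protocol plumbing for you.

```python
# server.py - a minimal MCP server sketch using the official Python SDK
# (pip install "mcp[cli]"). The name "demo-server" and the add tool are
# purely illustrative.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

if __name__ == "__main__":
    mcp.run()  # speaks the Model Context Protocol over stdio by default
```

Any MCP-aware client can now discover and call `add` without either side writing custom integration code.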

What can an MCP Server actually do?


Quite a bit. Depending on how they’re set up, MCP servers can expose:

  • Resources – Stuff like files, documents, or database queries that an AI can read.
  • Tools – Actions like sending an email, creating a GitHub issue, or checking the weather.
  • Prompts – Predefined instructions or templates that guide AI behavior in repeatable ways.

Each of these is exposed through a JSON-RPC 2.0 interface, meaning AI clients can query what's available, call the appropriate function, and get clean, structured responses.
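
Under the hood, those exchanges are plain JSON-RPC 2.0 messages. As a rough sketch (the `tools/list` and `tools/call` methods come from the MCP spec, while the `get_weather` tool is invented for illustration), a client's requests look something like this:

```python
import json

# Sketch of the JSON-RPC 2.0 messages an MCP client sends to a server:
# first discover the available tools, then call one of them by name.
# "get_weather" and its arguments are hypothetical.
list_tools = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

call_tool = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Berlin"}},
}

print(json.dumps(list_tools, indent=2))
print(json.dumps(call_tool, indent=2))
```

The server replies with equally structured results, so the model never has to scrape or guess at a response format.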

So... how does an MCP server actually work?

MCP servers follow a well-defined architecture intended to standardize how AI models access external tools, data, and services.

MCP client-server architecture | Source: modelcontextprotocol.io

Each part of the system has a clear role, contributing to a modular and scalable environment for AI integration.

  • Host Applications
    These are the environments where AI agents operate, such as coding assistants, desktop apps, or conversational UIs.

    They don’t interact with external systems directly, but instead rely on MCP clients to broker those connections.
  • MCP Clients
    The client is responsible for managing the connection between the AI agent and the MCP server. It handles protocol-level tasks like capability discovery, permissions, and communication state.

    Clients maintain direct, persistent connections to the server, ensuring requests and responses are handled correctly.
  • MCP Servers
    The server exposes defined capabilities such as reading files, executing functions, or retrieving documents using the Model Context Protocol.

    Each server is configured to present these capabilities in a standardized format that AI models can interpret without needing custom integration logic.
  • Underlying Data or Tooling
    This includes everything the server is connected to: file systems, databases, external APIs, or internal services.

    The server mediates access, applying permission controls, formatting responses, and exposing only what the client is authorized to use.

This separation of roles between the model host, client, server, and data source allows AI applications to scale and interoperate cleanly.

Developers can focus on defining useful capabilities inside a server, knowing that any MCP-compatible client can access them predictably and securely.
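
To see those roles in action, here is a hedged sketch of the client side using the official Python SDK: the client launches a server over stdio, performs the protocol handshake, discovers its tools, and calls one. The `server.py` command and the `add` tool refer back to the earlier illustrative server, not to any fixed names.

```python
# client.py - a sketch of an MCP client session using the official Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the (illustrative) server from the earlier sketch as a subprocess.
server_params = StdioServerParameters(command="python", args=["server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # protocol handshake
            tools = await session.list_tools()  # capability discovery
            print([tool.name for tool in tools.tools])

            result = await session.call_tool("add", arguments={"a": 2, "b": 3})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```

In a real host application (Claude Desktop, Cursor, and so on), this client layer is built in; you only point it at the servers you want to use.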

Wait, so how are MCP Servers different from APIs?

Fair question. It might sound like MCP is just a fancy wrapper around regular APIs, but there are key differences:

| Feature | Traditional API | MCP Server |
|---|---|---|
| Purpose | General software communication | Feed AI models with data, tools, or prompts |
| Interaction | Requires manual integration and parsing | Presents info in a model-friendly format |
| Standardization | Varies wildly per service | Unified protocol (MCP) |
| Security | Must be implemented case-by-case | Built-in controls and isolation |
| Use case | Backend services, apps, etc. | Enhancing AI agents like Claude, Copilot, or Cursor |

Basically, APIs were made for apps. MCP servers were made for AI.
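
As a loose illustration of that difference (the weather endpoint below is entirely hypothetical): with a traditional API, your application hard-codes both the call and the parsing, while an MCP client simply asks the server what it can do.

```python
import requests

# Traditional API: bespoke glue code for one specific service.
# The endpoint and the response schema are hypothetical examples.
resp = requests.get("https://api.example.com/weather", params={"city": "Berlin"})
forecast = resp.json().get("forecast")  # schema knowledge lives in your code

# With MCP, the client calls the standard "tools/list" and "tools/call" methods
# instead, and the server describes its own tools and input schemas, so no
# per-service parsing logic ends up inside the AI application.
```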

Want to spin up your own self-hosted MCP Server?

While building a custom MCP server from scratch is entirely possible, you don’t have to start there.

There’s already a growing list of open-source MCP servers you can clone, deploy, and start testing with your preferred AI assistant like Claude, Cursor, or others.

mcpservers.org is an amazing website to find open-source MCP Servers
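
For example, many prebuilt servers can be wired into Claude Desktop just by editing its claude_desktop_config.json. Here is a rough sketch of such an entry (the filesystem server is one of the official reference servers; the path is a placeholder):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/you/Documents"]
    }
  }
}
```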

If you're interested in writing your own server or extending an existing one, stay tuned. We'll walk through the process step by step in a dedicated upcoming post, using the official Python SDK.

Make sure you're following, or better yet, subscribe, so you don't miss it.

Want to learn more about MCP?

Here are a few great places to start; I personally found these to be good introductions to MCP servers:

  1. How I Finally Understood MCP — and Got It Working in Real Life - Towards Data Science
  2. What are MCP Servers And Why It Changes Everything - Hugging Face

Conclusion

And there you have it, a foundational understanding of what MCP servers are, what they can do, and why they’re quickly becoming a cornerstone in the evolving landscape of AI.

We’ve only just scratched the surface, but hopefully, this introduction has demystified some of the initial complexities and highlighted the immense potential these servers hold for building more robust, secure, and integrated AI applications.

Stay tuned for our next deep dive, where we’ll try and build an MCP server and a client from scratch with the Python SDK. Because really, the best way to learn is to get your hands dirty.

Until then, happy hacking. 🧛

About the author
Abhishek Kumar

I'm definitely not a nerd, perhaps a geek who likes to tinker around with whatever tech I get my hands on. Figuring things out on my own gives me joy. BTW, I don't use Arch.
