MCP (Model Context Protocol) is the open standard from Anthropic for connecting AI agents to external tools and data sources through a typed, introspectable interface.
MCP is a protocol. An MCP server exposes tools, resources, and prompts. An MCP client (Claude, Cursor, Codex, ChatGPT Dev Mode, Gemini CLI, Windsurf, VS Code, Zed) connects to the server, discovers what's available, and calls tools on behalf of the user.
Tools are functions the agent can call (send email, query database, create issue). Resources are read-only content (account state, schema, memory). Prompts are pre-built prompt templates the server offers.
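Concretely, a server describes each tool as a JSON descriptor: a name, a description, and a JSON Schema for its input. A minimal sketch of the three primitives as a client would see them after discovery (the `send_email` tool, resource URI, and prompt name here are illustrative, not a real server's surface):

```python
import json

# A tool descriptor, as returned in a tools/list result.
# The name and schema are hypothetical examples.
tool = {
    "name": "send_email",
    "description": "Send a transactional email to a single recipient.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "to": {"type": "string"},
            "subject": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["to", "subject"],
    },
}

# Resources are read-only and addressed by URI; prompts are named templates.
resource = {"uri": "account://state", "name": "Account state", "mimeType": "application/json"}
prompt = {"name": "campaign-brief", "description": "Draft a campaign brief from brand memory."}

print(json.dumps(tool["inputSchema"]["required"]))
```

The JSON Schema is what lets the client validate arguments and show the model a typed signature, rather than free-text instructions.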
Before MCP, connecting an AI agent to an external tool required either browser automation (fragile) or custom API wrappers per client. MCP standardises the connection: write the server once, every MCP-compatible client can use it.
It also replaces the messy integration layer each client used to invent separately on top of function calling. MCP is a single wire format.
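That wire format is plain JSON-RPC 2.0. A sketch of what a tool invocation looks like on the wire (the method and field names follow the MCP spec; the tool name and arguments are hypothetical):

```python
import json

# A tools/call request: a standard JSON-RPC 2.0 envelope.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "send_email",  # hypothetical tool name
        "arguments": {"to": "ada@example.com", "subject": "Hello"},
    },
}

# The same message round-trips over any transport, stdio or HTTP.
wire = json.dumps(request)
decoded = json.loads(wire)
assert decoded["method"] == "tools/call"
```

Because every client speaks this same envelope, a server never needs client-specific code paths.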
For vendors: build your MCP server once, ship to every major AI client at the same time. For users: one connector per tool, works everywhere.
stdio: process-to-process, for local development and for clients that prefer subprocess management (Claude Code with local servers).
HTTP (remote): HTTP POST with streaming for remote-hosted servers. This is the preferred production path - OAuth authentication, no local process, works across clients.
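In client configuration the two transports look like this. A sketch assuming the `mcpServers` config shape used by Claude-family clients; exact keys vary by client, and the `local-dev` entry and server path are placeholders:

```json
{
  "mcpServers": {
    "local-dev": {
      "command": "node",
      "args": ["./build/server.js"]
    },
    "nitrosend": {
      "type": "http",
      "url": "https://api.nitrosend.com/mcp"
    }
  }
}
```

Same server contract either way; only the transport line changes.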
Most serious vendors (nitrosend included) ship a remote HTTP MCP with OAuth.
nitrosend ships a 21-tool remote HTTP MCP server at `https://api.nitrosend.com/mcp`. Tools cover the full email marketing surface (transactional, campaigns, flows, contacts, segments, domains, deliverability, billing).
OAuth handles auth - no API keys in your shell history. Every tool is idempotent. Resources expose account state, schema, and brand memory so the agent can read context before writing.
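Under the hood, a remote call is an authenticated HTTP POST carrying a JSON-RPC body. A sketch of the request a client builds against the endpoint above (the token is a placeholder obtained via the client's OAuth flow; the request is constructed but not sent here):

```python
import json
import urllib.request

token = "ACCESS_TOKEN_FROM_OAUTH"  # placeholder; the client's OAuth flow supplies this

body = json.dumps({
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/list",
    "params": {},
}).encode()

req = urllib.request.Request(
    "https://api.nitrosend.com/mcp",
    data=body,
    headers={
        "Content-Type": "application/json",
        # Streamable HTTP responses may arrive as JSON or as an SSE stream.
        "Accept": "application/json, text/event-stream",
        "Authorization": f"Bearer {token}",
    },
    method="POST",
)

# urllib.request.urlopen(req) would perform the call; omitted in this sketch.
assert req.get_header("Authorization").startswith("Bearer ")
```

Because every tool is idempotent, an agent can safely retry a POST that times out without double-sending a campaign.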
Full reference: email MCP server. Install guide per AI client: integrations hub.
No. Anthropic created the spec, but MCP is open and adopted by OpenAI (ChatGPT Dev Mode, Codex), Google (Gemini CLI), Cursor, Windsurf, VS Code (Copilot), Zed, and most other major AI coding and agent tools.
Function calling is the in-model primitive the client uses to invoke tools. MCP is a protocol on top of it that makes a tool discoverable across clients, transport-agnostic, and versioned. Think of function calling as HTTP and MCP as REST.
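The layering is visible in how a client bridges the two: an MCP tool descriptor maps almost field-for-field onto the in-model function-calling spec. A sketch of that translation (the tool name and the OpenAI-style target shape are illustrative):

```python
# Hypothetical MCP tool descriptor, as listed by a server.
mcp_tool = {
    "name": "create_segment",
    "description": "Create a contact segment from a filter.",
    "inputSchema": {
        "type": "object",
        "properties": {"name": {"type": "string"}},
        "required": ["name"],
    },
}

def to_function_spec(tool: dict) -> dict:
    """Translate an MCP tool listing into a function-calling declaration."""
    return {
        "name": tool["name"],
        "description": tool.get("description", ""),
        "parameters": tool["inputSchema"],
    }

spec = to_function_spec(mcp_tool)
assert spec["parameters"]["required"] == ["name"]
```

MCP adds what function calling alone lacks: discovery, transport, auth, and versioning around that spec.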
You need both. REST API for developers writing code; MCP for agents operating your product. They cover different users.
Current adoption curve (Anthropic + OpenAI + Google + every major IDE agent) makes that unlikely in the near term. If a successor ships, it'll almost certainly support MCP adapters for backward compatibility.