Allows LLMs to integrate more easily with external tools
It establishes a standard interface for APIs to communicate with LLMs. This way all kinds of APIs, each with their own patterns, can be integrated into an LLM app without much hassle.
Without MCP
You have to build a separate integration for each application and tool
{mcptools} - Allows MCP-enabled tools like Claude Desktop, Claude Code, and VS Code GitHub Copilot to run R code in the R sessions you have running in order to answer your questions (see the config sketch below)
Works well with {btw}
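To make the wiring concrete, a minimal config sketch for a client like Claude Desktop (assuming {mcptools}'s documented `mcp_server()` entrypoint; the server name "r-session" is just a label — check the package docs for the exact setup):

```json
{
  "mcpServers": {
    "r-session": {
      "command": "Rscript",
      "args": ["-e", "mcptools::mcp_server()"]
    }
  }
}
```

The client launches that command itself and talks to the R process over stdio.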
{mcpr} - Enables R applications to expose capabilities (tools, resources, and prompts) to AI models through a standard JSON-RPC 2.0 interface. It also provides client functionality to connect to and interact with MCP servers
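Concretely, every tool call in MCP is a JSON-RPC 2.0 request. A sketch of one (the `tools/call` method comes from the MCP spec; the tool name and arguments are hypothetical):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_channel_messages",
    "arguments": { "channel": "#alerts", "limit": 10 }
  }
}
```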
{fastapi_mcp} - Exposes FastAPI endpoints as Model Context Protocol (MCP) tools, with authentication support
Using MCP servers rather than cloud-service CLI tools (e.g. the BigQuery CLI) gives you better control over what LLM products (e.g. Claude Code) can access, which is especially important for sensitive data that requires logging or raises privacy concerns.
Protocol
MCP Clients: The various clients and apps you use, like Cursor, that know how to talk using the “MCP Protocol.”
MCP Servers: Providers like Sentry, Slack, JIRA, Gmail, etc. set up adapters around their APIs that follow the MCP Protocol.
They convert a message like “Get me the recent messages in the #alerts channel” to a request that can be sent to the Slack API
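A sketch of that conversion in TypeScript (the function name and the assumption that “#alerts” has already been resolved to a channel ID are mine; `conversations.history` is a real Slack Web API method):

```typescript
// Sketch: turn a structured tool call into a Slack Web API request.
// Assumes SLACK_TOKEN is set in the environment and the channel name
// has already been resolved to an ID (e.g. via conversations.list).
async function getChannelMessages(channelId: string, limit = 10): Promise<string[]> {
  const params = new URLSearchParams({ channel: channelId, limit: String(limit) });
  const res = await fetch(`https://slack.com/api/conversations.history?${params}`, {
    headers: { Authorization: `Bearer ${process.env.SLACK_TOKEN}` },
  });
  const data = await res.json();
  if (!data.ok) throw new Error(`Slack API error: ${data.error}`);
  return data.messages.map((m: { text: string }) => m.text);
}
```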
Server File(s) Components (see the sketch after this list)
Defines a set of tools for the MCP Server to implement
Creates the server to listen for incoming requests
Defines a `switch` over tool names that receives each tool call and calls the underlying external API, like the Slack API.
Wires up the transport between the client and server
Allows the MCP client and server to communicate more simply than via standard HTTP requests: the client launches the server as a subprocess and the two exchange messages over standard I/O (stdin/stdout).
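Putting the four components together, a minimal sketch using the official TypeScript SDK's low-level `Server` API (@modelcontextprotocol/sdk); it reuses `getChannelMessages` from the Slack sketch above, and the tool schema is illustrative rather than Slack's official shape:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// 1. Define the set of tools the server exposes.
const TOOLS = [
  {
    name: "get_channel_messages",
    description: "Fetch recent messages from a Slack channel",
    inputSchema: {
      type: "object",
      properties: {
        channel: { type: "string", description: "Slack channel ID" },
        limit: { type: "number" },
      },
      required: ["channel"],
    },
  },
];

// 2. Create the server that listens for incoming requests.
const server = new Server(
  { name: "slack-adapter", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools: TOOLS }));

// 3. Switch on the tool name and call the underlying external API.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  switch (request.params.name) {
    case "get_channel_messages": {
      const { channel, limit } = request.params.arguments as {
        channel: string;
        limit?: number;
      };
      const messages = await getChannelMessages(channel, limit); // from the Slack sketch above
      return { content: [{ type: "text", text: messages.join("\n") }] };
    }
    default:
      throw new Error(`Unknown tool: ${request.params.name}`);
  }
});

// 4. Wire up the stdio transport between client and server.
const transport = new StdioServerTransport();
await server.connect(transport);
```

The client (Cursor, Claude Desktop, etc.) starts this file as a subprocess, and the JSON-RPC messages shown earlier flow over its stdin/stdout.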