May 19, 2025 · 5 min read

Wrapping an Existing API with MCP: How to Expose Your Current APIs to LLMs

Open Spaces is Gun.io’s field guide to the tools our engineers actually ship with. In this second installment, building on Part 1, “Model Context Protocol (MCP)—The Missing Layer Between AI and Your Apps,” veteran full-stack architect Tim Kleier shows how you can unlock AI muscle without touching your underlying stack.

Have a plain-vanilla REST endpoint—say, one that files support tickets? Strap on MCP and suddenly Claude, ChatGPT, or any LLM agent can create, update, and triage tickets through simple prompts. Tim walks through wrapping a Node-based ticket service, exposing an /mcp endpoint, and watching a live LLM open a ticket for “Mark in Milwaukee” in real time. Code snippets are inline; the full repo lives at github.com/upgrade-solutions/mcp-api-wrapper.

If you’ve been waiting for a practical way to let AI act on your data—not just talk about it—start here.

Our Scenario

Let’s imagine we work for a SaaS company with a custom system for creating support tickets. We want to hook it up to Claude so our customer support team can interact with the API through prompts. This is our end goal:
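prompt → Claude (MCP host) → create_ticket MCP tool at /mcp → createTicket() → new support ticket in our system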

We give Claude a prompt and it employs the create_ticket MCP tool, which connects to our support API to create a support ticket for Mark. Before we dive into the code to see how the MCP tool is exposed alongside our regular API, let’s first talk about MCP tools. 

What Are MCP Tools?

MCP defines a standard way for large language models (LLMs) to interact with external systems via tools. For a deeper understanding of MCP’s architecture and capabilities, see our complete introduction to Model Context Protocol. An MCP tool is a description of an action a model can take. Each MCP tool has these basic characteristics:

  • Name (e.g. “create_ticket”)
  • Description (e.g. “creates a new support ticket”)
  • Input Schema (e.g. issue_type, customer_email, customer_name)

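On the wire, those characteristics map onto a JSON object whose field names come from the MCP spec, with the input schema expressed as plain JSON Schema. Our create_ticket tool, for example, would be advertised to clients roughly like this (a sketch; the values mirror the tool definition shown later):

{
  "name": "create_ticket",
  "description": "Creates a new support ticket",
  "inputSchema": {
    "type": "object",
    "properties": {
      "customer_email": { "type": "string", "description": "Customer email address" },
      "customer_name": { "type": "string", "description": "Customer name" },
      "issue_message": { "type": "string", "description": "Customer message" },
      "issue_type": { "type": "string", "enum": ["bug", "feature_request", "other"] }
    },
    "required": ["customer_email", "customer_name", "issue_message", "issue_type"]
  }
}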
Rather than creating a separate MCP server just for tools, you can expose your MCP tool definitions at a dedicated endpoint within your existing API. This keeps things simple and lets LLMs discover the capabilities dynamically. 

Here’s the MCP definition of our tool (create_ticket) in a Node context:

// A create_ticket server tool exposed at /mcp
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';

// Server setup, shown for completeness (the version string is illustrative)
const server = new McpServer({ name: 'support-api', version: '1.0.0' });

server.tool(
  'create_ticket',
  'Creates a new support ticket',
  {
    customer_email: z.string().describe('Customer email address'),
    customer_name: z.string().describe('Customer name'),
    issue_message: z.string().describe('Customer message'),
    issue_type: z.enum(['bug', 'feature_request', 'other']).describe('Type of issue'),
  },
  async ({ customer_email, customer_name, issue_message, issue_type }) => {
    // Reuse the same helper our REST endpoint calls
    const ticket = {
      customer_email,
      customer_name,
      issue_message,
      issue_type,
    };
    await createTicket(ticket);
    // Return a text payload the LLM can relay back to the user
    return {
      content: [{ type: 'text', text: `Ticket created for ${customer_name} (${customer_email}).` }],
    };
  }
);

We first provide a name and description for the tool, then define the input schema. Finally, we define the function for processing a tool request. This is the corresponding API endpoint using Express:

// Create ticket through POST request
import express, { Request, Response } from 'express';

const app = express();
app.use(express.json()); // parse JSON request bodies

app.post('/tickets', async (req: Request, res: Response) => {
  const { customer_email, customer_name, issue_message, issue_type } = req.body;
  const ticket = await createTicket({
    customer_email,
    customer_name,
    issue_message,
    issue_type,
  });
  res.status(201).json({
    message: 'Ticket created successfully',
    data: ticket,
  });
});

The key point of overlap here is createTicket(). Our API endpoint calls that function to create a support ticket, and our MCP tool does exactly the same.
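The shared createTicket() function isn’t shown in this post (the full version lives in the repo), but a minimal sketch could look like the following; the Ticket type and in-memory store are illustrative stand-ins:

// A minimal createTicket() sketch; the real implementation is in the repo
interface Ticket {
  customer_email: string;
  customer_name: string;
  issue_message: string;
  issue_type: 'bug' | 'feature_request' | 'other';
}

const tickets: (Ticket & { id: number })[] = []; // stand-in for a real data store

async function createTicket(input: Ticket) {
  const ticket = { id: tickets.length + 1, ...input };
  tickets.push(ticket); // a production version would persist to a database
  return ticket;
}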

MCP Endpoint

Now that we’ve seen the side-by-side comparison of our ticket creation functionality in an API endpoint and an MCP tool, let’s take a look at how the MCP “server” is really just exposed as an endpoint in our API.

// Expose the MCP server over HTTP
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';

app.post('/mcp', async (req: Request, res: Response) => {
  // Stateless mode: sessionIdGenerator is undefined, so no session tracking
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined,
  });
  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

In our tool declaration above, we added a tool to the server object, and here we expose that server over HTTP (via a streamable HTTP transport) to the outside world. 
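Under the hood, MCP clients speak JSON-RPC 2.0 to this endpoint. A tools/call request for our tool looks roughly like this (the argument values are illustrative):

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "create_ticket",
    "arguments": {
      "customer_email": "mark@example.com",
      "customer_name": "Mark",
      "issue_message": "I can't log in to my account.",
      "issue_type": "bug"
    }
  }
}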

This /mcp endpoint provides a list of available tools for MCP hosts and clients to interact with. Using a tool like MCP Inspector, we can see that the list of tools includes create_ticket, along with the parameters it accepts.
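If you don’t have the Inspector handy, it runs straight from npx; launch it, then connect to the local endpoint using the Streamable HTTP transport option:

# Launch MCP Inspector, then point it at http://localhost:3000/mcp
npx @modelcontextprotocol/inspector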

Calling an MCP-wrapped API

Now that we’ve verified our MCP endpoint is working, we can point Claude to the running API by adding an entry to Claude Desktop’s configuration file (claude_desktop_config.json):

{
  "mcpServers": {
    "support-api": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:3000/mcp"
      ]
    }
  }
}

After restarting Claude, the support-api server’s create_ticket tool is available, and we can prompt Claude to create a support ticket for us.
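For example, a prompt along these lines (the wording and email address are illustrative) is enough to trigger the tool:

“Please open a support ticket for Mark in Milwaukee (mark@example.com). He says he can’t log in to his account. It sounds like a bug.”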

Final Thoughts

As we’ve discussed in this article, you can hook up LLMs to your existing API without much additional code. MCP gives you a thin abstraction layer that lets LLMs safely and intelligently interact with your systems—starting with the APIs you already have.

Whether you’re improving customer support, automating IT workflows, or streamlining internal operations, wrapping your API with MCP can empower your workforce with new AI tooling. To see complete workflow automation in action, check out how we built autonomous business workflows that handle entire customer support tickets from start to finish.

What’s Next

Now that you’ve seen how to wrap existing APIs, you’re ready to explore more advanced MCP implementations.

Ready to Integrate AI with Your Existing Systems?

Wrapping APIs with MCP is just the beginning. Building production-ready AI integrations requires expertise in system architecture, API design, and LLM orchestration. Gun.io connects you with senior engineers who specialize in AI implementations and can help you unlock your data for intelligent automation.

Find AI Integration Specialists | Join Our Engineering Network
