Building a Standalone MCP Server: Bridging Legacy and Modern Systems with AI
Open Spaces is Gun.io’s field guide to the tools our engineers actually ship with. In this third installment, veteran full-stack architect Tim Kleier shows how you can unlock AI muscle without touching your underlying stack. This builds on Part 1: Model Context Protocol (MCP)—The Missing Layer Between AI and Your Apps, and Part 2: Wrapping an Existing API with MCP: How to Expose Your Current APIs to LLMs.
What if AI could seamlessly query both modern APIs and legacy databases—without your team rebuilding a thing?
That’s exactly what the Model Context Protocol (MCP) enables. In a previous article, Wrapping an Existing API with MCP, we connected to a customer support API. In this guide, we’ll build a standalone MCP server that exposes a legacy database to large language models (LLMs).
We’ll continue to use our customer support scenario to walk through how MCP servers work. For an intro to MCP, check out Model Context Protocol: The Missing Layer Between AI and Your Apps.
The Use Case: Investigating Old Tickets with AI
Two years ago, your company switched to a new customer support platform. The new system offers a clean REST API. But the old system? It’s just a dusty Postgres database.
Now imagine a B2B customer submits a ticket:
“Hey, why is our payment $499/month now? We agreed to $350/month when we signed on three years ago.”
Support reps would normally have to contact a developer to query the legacy database. But with an MCP server in place, an LLM can do it for them.
Why Use a Standalone MCP Server?
While MCP tools can be embedded in an existing app (as we showed here), some organizations prefer to keep AI tooling separate:
- Modularity – A dedicated AI layer avoids polluting core app code.
- Cross-system Orchestration – A standalone MCP server can wrap multiple systems into a unified toolset.
- Scale – You can run the MCP server as a scalable service behind an agent gateway.
Standalone MCP servers let you build flexible, language-agnostic interfaces on top of your existing systems. To see how multiple MCP tools can be orchestrated into complete workflows, check out our guide to building autonomous business workflows with MCP.
Let’s see one in action.
Step 1: Set Up the MCP Server
Creating an MCP server is pretty simple, and you can check out the official docs for a quickstart. In order to connect to our legacy support system, we’ll use a PostgreSQL connector with read-only queries.
Below are some of the key code snippets, and the full source code can be found here: https://github.com/upgrade-solutions/mcp-server-postgres
```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import pg from "pg";

// set up the server
const server = new Server(
  {
    name: "customer-support-database",
    description: "MCP server for connecting to customer support PostgreSQL database",
    version: "0.1.0",
  },
  {
    capabilities: {
      resources: {},
      tools: {},
    },
  },
);

// declare the "query" tool
server.setRequestHandler(ListToolsRequestSchema, async () => {
  return {
    tools: [
      {
        name: "query",
        description: "Run a read-only SQL query",
        inputSchema: {
          type: "object",
          properties: {
            sql: { type: "string" },
          },
        },
      },
    ],
  };
});

// set up PostgreSQL pool connection
// (databaseUrl is parsed from the command-line args; see the full source)
const pool = new pg.Pool({
  connectionString: databaseUrl,
});

// execute the SQL inside a read-only transaction
const client = await pool.connect();
await client.query("BEGIN TRANSACTION READ ONLY");
const result = await client.query(sql);

// start server
async function runServer() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
}
runServer().catch(console.error);
```
As you can see, the code primarily involves basic MCP server setup, a PostgreSQL connection, registering the query tool, and then query execution.
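Because the `query` tool accepts arbitrary SQL from the model, it's worth adding a defensive check before anything reaches Postgres, on top of the `READ ONLY` transaction. Here's a minimal sketch of such a guard; `isReadOnlySql` is a hypothetical helper, not part of the MCP SDK or the repo above:

```typescript
// naive guard: allow only a single SELECT or WITH statement, reject mutations.
// this is defense-in-depth alongside the READ ONLY transaction, not a substitute
// (e.g. it doesn't account for semicolons inside string literals).
function isReadOnlySql(sql: string): boolean {
  const stripped = sql
    .replace(/--.*$/gm, "")           // strip line comments
    .replace(/\/\*[\s\S]*?\*\//g, "") // strip block comments
    .trim();
  const body = stripped.replace(/;\s*$/, ""); // drop a trailing semicolon
  if (body.includes(";")) return false;       // reject multi-statement input
  return /^(select|with)\b/i.test(body);
}

console.log(isReadOnlySql("SELECT * FROM tickets WHERE id = 1")); // true
console.log(isReadOnlySql("DROP TABLE tickets"));                 // false
```

The read-only transaction remains the real safety net; the guard just lets the server fail fast with a clear error instead of surfacing a Postgres permission failure to the model.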
In order to use the postgres MCP server with Claude, you’ll need to configure your claude_desktop_config.json file.
```json
{
  "mcpServers": {
    "customer-support-database": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgres://<host>/<table>/schema"
      ]
    }
  }
}
```
Step 2: Use It from an LLM Agent
Now that the MCP server is ready and Claude has registered it, we can find that old ticket about grandfathered pricing.
Not only was Claude able to find the record, it did so even when given a slightly incorrect email address. That's because it can execute multiple read-only queries until it finds the data we're looking for. Incredibly powerful!
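To illustrate the kind of query-broadening the model performs on its own, here's a sketch of the progression from an exact match to fuzzier ones. The table and column names are assumptions for illustration, not the real legacy schema:

```typescript
// hypothetical illustration of how an LLM broadens its search via the "query"
// tool: exact match first, then tolerate a typo in either half of the email.
function candidateQueries(email: string): string[] {
  const [local, domain] = email.split("@");
  return [
    `SELECT * FROM tickets WHERE customer_email = '${email}'`,        // exact
    `SELECT * FROM tickets WHERE customer_email ILIKE '${local}@%'`,  // typo in domain
    `SELECT * FROM tickets WHERE customer_email ILIKE '%@${domain}'`, // typo in local part
  ];
}
```

In practice the model composes these fallbacks itself from the tool description; no retry logic needs to live in the server.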
Closing Thoughts
As AI agents grow more capable, they need structured access to real data—not just chat history. MCP offers the missing abstraction for that.
Ready to see how these tools come together? Our final installment shows how to orchestrate multiple MCP tools into autonomous business workflows. Whether it’s a brand-new REST API or a crusty legacy database, the model doesn’t care—as long as it has tools to work with.
And that’s the future with MCP: a seamless bridge between systems old and new.
What’s Next: Complete Workflow Automation
You’ve learned the building blocks—now see them in action. In our final installment, Creating Business Workflows with LLMs and MCP, we orchestrate multiple MCP tools (like the ones covered in this series) into autonomous agents that handle complete business processes from start to finish.
The Complete MCP Series
Part 1: Model Context Protocol Introduction – Understanding MCP fundamentals
Part 2: Wrapping Existing APIs – Connect modern REST endpoints
Part 3: Building Standalone MCP Servers – Connect legacy systems (you are here)
Part 4: Autonomous Business Workflows – Complete workflow automation
Ready to Bridge Your Legacy Systems with AI?
Connecting legacy databases and modern AI requires deep expertise in system integration, database optimization, and secure AI architectures. Gun.io connects you with senior engineers who specialize in MCP implementations and can help you unlock the value trapped in your existing systems.