Local MCP lets you run your own tools on your own infrastructure and have the DeepCurrent AI call them during chat — just like the built-in DeepCurrent tools, but hosted by you. This is useful for connecting private data sources, proprietary APIs, or any tool that cannot leave your environment.

How it works

When you start a chat session, DeepCurrent connects to your MCP server and auto-discovers all the tools it exposes. Those tools become available to the AI alongside the built-in DeepCurrent tools, with no extra configuration needed after the initial connection. Your MCP server must expose a Streamable HTTP endpoint; DeepCurrent connects to it using the Model Context Protocol over the Streamable HTTP transport.
DeepCurrent forwards the user's Authorization header from the chat session to your MCP server on every tool call, so your server can verify the caller's identity.

Use cases

Private data sources

Connect internal databases or proprietary datasets that cannot leave your infrastructure.

Proprietary APIs

Wrap any internal or partner API as a callable tool in DeepCurrent chat.

Custom workflows

Build multi-step automations tailored to your team’s specific processes.

Prototype and test

Try out new tool ideas locally before investing in a full integration.

Connect your MCP server

1. Implement your MCP server

Build a server that implements the Model Context Protocol and exposes a Streamable HTTP endpoint, using any MCP-compatible SDK or framework you prefer. Your server must be reachable from the DeepCurrent chat runtime. The endpoint URL typically ends in /mcp (for example, http://your-server:8001/mcp).
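At its core, tool discovery and invocation come down to two JSON-RPC methods: tools/list and tools/call. The stdlib-only sketch below illustrates their request/response shapes with a hypothetical lookup_order tool returning stubbed data; it is not a full Streamable HTTP implementation. A production server should use an MCP SDK, which also handles the initialize handshake, sessions, and SSE streaming.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Hypothetical tool advertised to DeepCurrent via tools/list.
TOOLS = [{
    "name": "lookup_order",
    "description": "Look up an order in the internal database.",
    "inputSchema": {"type": "object",
                    "properties": {"order_id": {"type": "string"}},
                    "required": ["order_id"]},
}]

def call_tool(name, args):
    """Dispatch a tools/call invocation; data here is stubbed."""
    if name == "lookup_order":
        return f"Order {args['order_id']}: status=shipped"
    raise ValueError(f"unknown tool {name}")

class MCPHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/mcp":
            self.send_error(404)
            return
        rpc = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        if rpc["method"] == "tools/list":
            result = {"tools": TOOLS}
        elif rpc["method"] == "tools/call":
            text = call_tool(rpc["params"]["name"], rpc["params"]["arguments"])
            result = {"content": [{"type": "text", "text": text}]}
        else:
            result = {}  # initialize / session handling elided in this sketch
        body = json.dumps({"jsonrpc": "2.0", "id": rpc.get("id"),
                           "result": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass

# Demo: serve on an ephemeral port in a background thread, then list tools once.
server = ThreadingHTTPServer(("127.0.0.1", 0), MCPHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/mcp"

req = urllib.request.Request(
    url,
    data=json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    tools = json.load(resp)["result"]["tools"]
print([t["name"] for t in tools])  # → ['lookup_order']
server.shutdown()
```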
2. Expose a Streamable HTTP endpoint

Ensure your server accepts POST requests on its /mcp path and responds according to the MCP Streamable HTTP transport spec. DeepCurrent will call this endpoint to discover your tools and invoke them during chat.
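Before pointing DeepCurrent at the endpoint, you can smoke-test it yourself. The probe below is a sketch, not part of DeepCurrent: list_tools posts a JSON-RPC tools/list request the way an MCP client would, and the inline stub server stands in for your real /mcp endpoint so the example runs on its own. A fully spec-compliant server may require an initialize handshake and session header first, and may answer with an SSE stream rather than plain JSON, so treat this as a reachability check.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def list_tools(endpoint, token=None):
    """Return the tool list advertised by an MCP Streamable HTTP endpoint."""
    payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
    headers = {
        "Content-Type": "application/json",
        # Streamable HTTP clients accept either a JSON body or an SSE stream.
        "Accept": "application/json, text/event-stream",
    }
    if token:
        headers["Authorization"] = f"Bearer {token}"
    req = urllib.request.Request(endpoint, data=json.dumps(payload).encode(),
                                 headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]["tools"]

# --- stand-in for your real /mcp endpoint, so this example is runnable ---
class Stub(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.dumps({"jsonrpc": "2.0", "id": 1,
                           "result": {"tools": [{"name": "lookup_order"}]}}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass

srv = ThreadingHTTPServer(("127.0.0.1", 0), Stub)
threading.Thread(target=srv.serve_forever, daemon=True).start()
tools = list_tools(f"http://127.0.0.1:{srv.server_address[1]}/mcp")
print([t["name"] for t in tools])  # → ['lookup_order']
srv.shutdown()
```

Against a real server, replace the stub URL with your endpoint, e.g. `list_tools("http://your-server:8001/mcp", token="...")`.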
3. Connect in DeepCurrent settings

In the DeepCurrent dashboard, go to Settings → Integrations → Local MCP and enter your server’s endpoint URL. Save the setting — DeepCurrent will connect automatically at the start of your next chat session and list your tools alongside the built-in ones.

Multi-step tool calling

DeepCurrent’s AI can make up to 10 tool call steps per conversation turn. This means it can chain your local tools together in a single message — fetching data, transforming it, and taking action — without any extra work on your part.
Ask the AI “What tools do you have access to?” at the start of a session to confirm your local tools were discovered successfully.

Authentication

DeepCurrent forwards the Authorization header from the chat request to your MCP server on every tool invocation. Your server receives the same Bearer <token> the user’s browser sent, so you can validate the caller’s identity and enforce access control within your tools.
Your MCP server is responsible for its own access control and rate limiting. DeepCurrent does not validate or restrict calls to your server beyond the 10-step-per-turn limit.
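A tool server might validate the forwarded header before doing any work. Below is a minimal sketch assuming a hypothetical in-memory token table (VALID_TOKENS and the identity alice@example.com are invented for illustration); a real deployment would verify a JWT or check the token against your identity provider instead.

```python
import hmac

# Hypothetical token table; a real server would verify a JWT or ask an IdP.
VALID_TOKENS = {"secret-token-123": "alice@example.com"}

def authenticate(headers):
    """Return the caller's identity from the forwarded Authorization header,
    or raise PermissionError if it is absent, malformed, or unknown."""
    scheme, _, token = headers.get("Authorization", "").partition(" ")
    if scheme != "Bearer" or not token:
        raise PermissionError("missing or malformed Authorization header")
    for known, user in VALID_TOKENS.items():
        if hmac.compare_digest(token, known):  # constant-time comparison
            return user
    raise PermissionError("unknown token")

print(authenticate({"Authorization": "Bearer secret-token-123"}))  # → alice@example.com
```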