DeepCore MCP Documentation
What is MCP
MCP (Model Context Protocol) is an open standard for building secure bi-directional connections, allowing AI models to interact with data sources. Through MCP, developers can:
Expose their data to AI models via MCP servers
Build AI applications (MCP clients) that connect to these servers
Enable AI models to maintain context across different tools and datasets
MCP addresses the challenges of AI systems accessing data, replacing traditional fragmented integration approaches with a universal open standard.
MCP Implementation in DeepCore
DeepCore implements complete MCP server functionality, allowing users to package their tools and data as MCP services for AI models to use. In DeepCore, MCP services are divided into two types:
Built-in MCP services: System-predefined services such as coin-api
Dynamic MCP services: Custom MCP services created by users through the API
In addition to these two types, DeepCore supports MCP tool importing: tools from external MCP services can be imported into the DeepCore tool marketplace, making it easy to integrate third-party capabilities.
Core Components
DeepCore's MCP implementation contains the following core components:
MCP Server: Provides the ability to interact with AI models through SSE (Server-Sent Events)
MCP Tools: Functional units encapsulated in the MCP server, typically representing API calls
MCP Prompt Templates: Prompt templates that help AI better use MCP tools
MCP Resources: Static content that MCP services can access
MCP Server Architecture
DeepCore's MCP implementation uses SSE as the transport layer, based on the following architecture:
AI model/client initiates SSE connection to the MCP server endpoint
Client can request tool lists, call tools, get prompts, access resources, etc.
MCP server processes requests and returns responses via SSE stream
All interactions use standardized MCP formats
Token Authentication
DeepCore uses API Token authentication for accessing all MCP-related APIs. This method is simple to implement and suitable for both development and production environments.
Prerequisites
Registered DeepCore account and obtained API token
Created or accessed certain tools (via the /api/tools endpoint)
Importing MCP Service Tools
DeepCore provides API interfaces that allow users to import tools from existing MCP services. This makes it easy to integrate functionality from third-party MCP services into your own DeepCore tool set for use in custom AI applications.
Process for Importing MCP Tools
1. Parse the MCP service URL: Use the /api/tools/parse-mcp endpoint to analyze an MCP service and obtain its tool definitions
2. Batch create tools: Use the /api/tools/create-batch endpoint to import the parsed tools into your account
Example: Parsing MCP Services
The following example shows how to parse an existing MCP service and get its tool definitions:
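A minimal sketch using only the Python standard library. The host name, the Bearer auth header, and the `mcp_url` body field are assumptions, since the exact request schema is not documented here; adjust them to match your deployment.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepcore.example"  # placeholder host; use your deployment's URL

def build_parse_request(mcp_url: str, token: str) -> urllib.request.Request:
    """Build a POST to /api/tools/parse-mcp (header and body field names are assumptions)."""
    body = json.dumps({"mcp_url": mcp_url}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/tools/parse-mcp",
        data=body,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

def parse_mcp_service(mcp_url: str, token: str) -> dict:
    """Send the request and return the parsed tool definitions."""
    with urllib.request.urlopen(build_parse_request(mcp_url, token), timeout=30) as resp:
        return json.load(resp)

if __name__ == "__main__" and os.environ.get("DEEPCORE_TOKEN"):
    tools = parse_mcp_service("https://example.com/mcp/sse", os.environ["DEEPCORE_TOKEN"])
    print(json.dumps(tools, indent=2))
```

The returned tool definitions can then be passed to /api/tools/create-batch in the next step.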
Example: Batch Importing Tools
After obtaining tool definitions by parsing an MCP service, you can use the following code to batch import these tools into your account:
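Continuing the sketch above, a hypothetical batch import call. The `tools` body field and the host are assumptions; the tool definitions would normally be the output of /api/tools/parse-mcp.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepcore.example"  # placeholder host

def build_batch_create_request(tool_definitions: list, token: str) -> urllib.request.Request:
    """Build a POST to /api/tools/create-batch (the `tools` body field is an assumption)."""
    body = json.dumps({"tools": tool_definitions}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/tools/create-batch",
        data=body,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

def create_tools_batch(tool_definitions: list, token: str) -> dict:
    """Import the parsed tool definitions into your account."""
    with urllib.request.urlopen(build_batch_create_request(tool_definitions, token), timeout=30) as resp:
        return json.load(resp)

if __name__ == "__main__" and os.environ.get("DEEPCORE_TOKEN"):
    # In practice, `tool_definitions` comes from /api/tools/parse-mcp
    tool_definitions = [{"name": "example_tool", "description": "An illustrative tool"}]
    print(create_tools_batch(tool_definitions, os.environ["DEEPCORE_TOKEN"]))
```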
Benefits of Importing MCP Tools
Functionality Reuse: No need to redevelop existing MCP tool functionality
Service Integration: Combine tools from multiple MCP services into one custom MCP service
Feature Customization: Selectively import the tools you need and skip the rest
Permission Control: Apply your own access control policies to imported tools
Better Integration: Seamlessly integrate third-party MCP service tools with your own developed tools
Creating MCP Services
Steps to Create an MCP Service
1. Select tools: Determine which tools to expose in the MCP service
2. Create the MCP server: Call the /api/mcp/create API to create an MCP server
3. Add prompt templates (optional): Add prompt templates via /api/mcp/{mcp_name}/prompts
4. Add resources (optional): Add resources via /api/mcp/{mcp_name}/resources
Getting Tool IDs
When creating an MCP service, you need to provide a list of tool IDs to include. These IDs must be existing tool IDs in the DeepCore platform. Here's how to get available tool IDs:
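A standard-library sketch of listing your tools and collecting their IDs. The host, the Bearer auth header, and the `id` response field are assumptions about the deployment.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepcore.example"  # placeholder host

def build_list_tools_request(token: str) -> urllib.request.Request:
    """Build a GET to /api/tools (the auth header format is an assumption)."""
    return urllib.request.Request(
        f"{BASE_URL}/api/tools",
        headers={"Authorization": f"Bearer {token}"},
        method="GET",
    )

def list_tool_ids(token: str) -> list:
    """Fetch the tool list and extract the IDs (response field names are assumptions)."""
    with urllib.request.urlopen(build_list_tools_request(token), timeout=30) as resp:
        tools = json.load(resp)
    return [tool["id"] for tool in tools]

if __name__ == "__main__" and os.environ.get("DEEPCORE_TOKEN"):
    print(list_tool_ids(os.environ["DEEPCORE_TOKEN"]))
```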
Example: Creating an MCP Service
Here's a complete example showing how to create an MCP service using an API Token:
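A hypothetical sketch of the create call. The `name` and `tool_ids` body fields, the host, and the auth header are assumptions; the tool IDs would come from the /api/tools listing above.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepcore.example"  # placeholder host

def build_create_mcp_request(name: str, tool_ids: list, token: str) -> urllib.request.Request:
    """Build a POST to /api/mcp/create (the `name`/`tool_ids` fields are assumptions)."""
    body = json.dumps({"name": name, "tool_ids": tool_ids}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/mcp/create",
        data=body,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__" and os.environ.get("DEEPCORE_TOKEN"):
    req = build_create_mcp_request("my-mcp", ["tool-id-1", "tool-id-2"], os.environ["DEEPCORE_TOKEN"])
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(json.load(resp))
```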
Example: Adding Prompt Templates
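A sketch of adding a prompt template via /api/mcp/{mcp_name}/prompts. The `name` and `template` body fields, the host, and the auth header are assumptions.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepcore.example"  # placeholder host

def build_add_prompt_request(mcp_name: str, prompt_name: str, template: str, token: str) -> urllib.request.Request:
    """Build a POST to /api/mcp/{mcp_name}/prompts (body field names are assumptions)."""
    body = json.dumps({"name": prompt_name, "template": template}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/mcp/{mcp_name}/prompts",
        data=body,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__" and os.environ.get("DEEPCORE_TOKEN"):
    req = build_add_prompt_request(
        "my-mcp", "analyze", "Analyze the following data: {input}", os.environ["DEEPCORE_TOKEN"]
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(json.load(resp))
```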
Example: Adding Resources
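A sketch of adding a static resource via /api/mcp/{mcp_name}/resources. The `uri` and `content` body fields, the host, and the auth header are assumptions.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepcore.example"  # placeholder host

def build_add_resource_request(mcp_name: str, uri: str, content: str, token: str) -> urllib.request.Request:
    """Build a POST to /api/mcp/{mcp_name}/resources (body field names are assumptions)."""
    body = json.dumps({"uri": uri, "content": content}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/mcp/{mcp_name}/resources",
        data=body,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__" and os.environ.get("DEEPCORE_TOKEN"):
    req = build_add_resource_request(
        "my-mcp", "docs://usage-guide", "How to use this service...", os.environ["DEEPCORE_TOKEN"]
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(json.load(resp))
```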
MCP Server Internal Implementation
DeepCore's MCP server internal implementation is based on the Python MCP server library and integrated with FastAPI. Main components include:
SseServerTransport: Provides an SSE-based transport layer
Server: MCP server core class, handles requests and responses
Handlers: Registered handler functions, including:
list_tools - List available tools
call_tool - Call a tool
list_prompts - List prompt templates
get_prompt - Get a specific prompt
list_resources - List resources
read_resource - Read resource content
Core Implementation
DeepCore's MCP implementation encapsulates tools into MCP tool format, automatically converting parameter structures and response formats to ensure compliance with MCP specifications. Each MCP service is stored in the database, including basic service information, associated tools, prompts, and resources.
Dynamic MCP service creation process:
1. Create an MCPServer record in the database
2. Associate MCPTool records pointing to existing tools
3. Optionally add MCPPrompt and MCPResource records
4. Process requests through routing, dynamically creating MCP server instances
Using MCP Services
Client Connection
AI models or clients can connect to MCP server endpoints via SSE:
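A minimal connection sketch using the official `mcp` Python SDK. The SSE endpoint path (`/mcp/{name}/sse`) and the base URL are assumptions about the deployment's routing.

```python
import asyncio
import os

def sse_endpoint(base_url: str, mcp_name: str) -> str:
    """SSE endpoint URL for a DeepCore MCP service (the path layout is an assumption)."""
    return f"{base_url.rstrip('/')}/mcp/{mcp_name}/sse"

async def connect(base_url: str, mcp_name: str) -> None:
    # Requires the official Python SDK: pip install mcp
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(sse_endpoint(base_url, mcp_name)) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # performs the MCP handshake
            print("connected")

if __name__ == "__main__" and os.environ.get("DEEPCORE_BASE_URL"):
    asyncio.run(connect(os.environ["DEEPCORE_BASE_URL"], "my-mcp"))
```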
Available Operations
Once connected, clients can perform the following operations:
List tools: Get list of available tools
Call tools: Execute tool operations and get results
List prompts: Get list of available prompt templates
Get prompts: Get specific prompt templates and fill variables
List resources: Get list of available resources
Read resources: Get resource content
Example: AI Model Using MCP Service
When an AI model (such as Claude) connects to an MCP service, it can perform the operations listed above: discover the available tools, call them with arguments, retrieve and fill prompt templates, and read resources.
Using Python Client to Connect to MCP Services
DeepCore MCP services can be connected to and used through various client libraries. Below is how to use the official mcp library and the mirascope library to connect to DeepCore-provided MCP services.
Installing Dependencies
First, install the necessary dependency packages:
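Assuming the official `mcp` Python SDK and the `mirascope` package from PyPI:

```shell
pip install mcp mirascope
```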
Example of Using Mirascope to Connect to MCP Services
Here's a basic example showing how to use the mirascope library to connect to a DeepCore MCP service and list available tools:
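Mirascope's MCP integration wraps the official SDK and its interface may change between versions, so this sketch uses the official `mcp` package directly; the SSE URL is a placeholder you would replace with your service's endpoint.

```python
import asyncio
import os

async def list_available_tools(sse_url: str) -> list:
    """Connect to an MCP service over SSE and return the names of its tools."""
    # Requires: pip install mcp
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(sse_url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            return [tool.name for tool in result.tools]

if __name__ == "__main__" and os.environ.get("DEEPCORE_MCP_SSE_URL"):
    names = asyncio.run(list_available_tools(os.environ["DEEPCORE_MCP_SSE_URL"]))
    print(names)
```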
Example: Calling MCP Tools
The following example shows how to call tools in an MCP service:
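A sketch using the official `mcp` SDK's `call_tool`. The tool name `get_price` and its `symbol` argument are hypothetical; substitute a tool your service actually exposes.

```python
import asyncio
import os

async def call_mcp_tool(sse_url: str, tool_name: str, arguments: dict):
    """Call a single tool on an MCP service and return its result content."""
    # Requires: pip install mcp
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(sse_url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(tool_name, arguments)
            return result.content

if __name__ == "__main__" and os.environ.get("DEEPCORE_MCP_SSE_URL"):
    content = asyncio.run(
        call_mcp_tool(os.environ["DEEPCORE_MCP_SSE_URL"], "get_price", {"symbol": "BTC"})
    )
    print(content)
```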
Example: Listing and Using Prompt Templates
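A sketch using the official `mcp` SDK's `list_prompts` and `get_prompt`. The prompt name and its arguments are hypothetical; use the names your service defines.

```python
import asyncio
import os

async def use_prompt(sse_url: str, prompt_name: str, arguments: dict):
    """List the service's prompt templates, then fetch one with variables filled in."""
    # Requires: pip install mcp
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(sse_url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            prompts = await session.list_prompts()
            print([p.name for p in prompts.prompts])  # available templates
            prompt = await session.get_prompt(prompt_name, arguments)
            return prompt.messages

if __name__ == "__main__" and os.environ.get("DEEPCORE_MCP_SSE_URL"):
    messages = asyncio.run(
        use_prompt(os.environ["DEEPCORE_MCP_SSE_URL"], "analyze", {"input": "example data"})
    )
    print(messages)
```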
Example: Reading Resources
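A sketch using the official `mcp` SDK's `list_resources` and `read_resource`. The resource URI is hypothetical; use a URI your service actually registers.

```python
import asyncio
import os

async def read_mcp_resource(sse_url: str, uri: str):
    """List the service's resources, then read the content of one by URI."""
    # Requires: pip install mcp
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(sse_url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            resources = await session.list_resources()
            print([r.uri for r in resources.resources])  # available resources
            result = await session.read_resource(uri)
            return result.contents

if __name__ == "__main__" and os.environ.get("DEEPCORE_MCP_SSE_URL"):
    contents = asyncio.run(
        read_mcp_resource(os.environ["DEEPCORE_MCP_SSE_URL"], "docs://usage-guide")
    )
    print(contents)
```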
Multilingual SDK Support
Model Context Protocol provides SDKs in multiple programming languages that integrate seamlessly with DeepCore MCP services. The sections below show how to use each language's SDK, so developers can work with MCP services in a language they are familiar with.
TypeScript/JavaScript SDK
The TypeScript SDK is one of the official implementations of Model Context Protocol and is suitable for Node.js and browser environments.
Installation
Usage Example
Java SDK
The Java SDK provides functionality for using MCP in Java applications and integrates well with Spring AI.
Adding Dependencies
Usage Example
Advantages of MCP Services
Using DeepCore MCP services offers the following advantages:
Standardized Interface: Use a unified protocol to connect different data sources
Secure Access: Controlled data access methods
Context Preservation: AI models can maintain context across different tools
Simplified Integration: No need to maintain separate connectors for each data source
Scalability: Easily add new tools and data sources
Practical Application Scenarios
1. Knowledge Base Access
Create MCP services connected to enterprise knowledge bases, allowing AI models to query and retrieve specific information.
2. Data Analysis Tools
Encapsulate data analysis functionality as MCP tools, enabling AI models to execute data queries and analysis.
3. Internal System Integration
Package internal system APIs as MCP services, giving AI assistants access to these functionalities.
4. Specialized Domain Tools
Create collections of specialized domain tools (such as finance, healthcare) and provide them to AI models through MCP.
Best Practices
Tool Design: Design simple, single-function tools rather than complex multi-function tools
Provide Clear Descriptions: Give clear descriptions for tools and parameters
Add Prompt Templates: Help AI models better understand and use tools
Resource Management: Provide commonly used information as resources to reduce repeated queries
Permission Control: Pay attention to access permission control for MCP services
Limitations and Considerations
MCP services currently only support synchronous calls, not long-running asynchronous operations
There are size limitations on data returned by tools; avoid returning overly large datasets
Be careful to protect sensitive information and authentication credentials
MCP server instances are created per request and do not maintain long-term state
Troubleshooting
Common issues and solutions:
Connection Problems: Check if the MCP service name is correct, confirm the service has been created
Authentication Errors: Verify user permissions and authentication tokens
Tool Call Failures: Check tool configuration and parameter formats
Resource Access Failures: Confirm resources have been added and are in the correct format
Summary
DeepCore MCP services provide a standardized way for AI models to securely and effectively access and operate various tools and data sources. Through simple API calls, developers can create powerful MCP services, extending the capability range of AI models and building smarter, more practical AI applications.