DeepCore MCP
DeepCore MCP Documentation
What is MCP
MCP (Model Context Protocol) is an open standard for building secure, bi-directional connections between AI models and data sources. Through MCP, developers can:
Expose their data to AI models via MCP servers
Build AI applications (MCP clients) that connect to these servers
Enable AI models to maintain context across different tools and datasets
MCP addresses the challenges of AI systems accessing data, replacing traditional fragmented integration approaches with a universal open standard.
MCP Implementation in DeepCore
DeepCore implements complete MCP server functionality, allowing users to package their tools and data as MCP services for AI models to use. In DeepCore, MCP services come in two types:
Built-in MCP services: System-predefined services such as coin-api
Dynamic MCP services: Custom MCP services created by users through the API
In addition, DeepCore supports MCP tool importing: tools from external MCP services can be imported into the DeepCore tool marketplace, enabling easy integration of third-party capabilities.
Core Components
DeepCore's MCP implementation contains the following core components:
MCP Server: Provides the ability to interact with AI models through SSE (Server-Sent Events)
MCP Tools: Functional units encapsulated in the MCP server, typically representing API calls
MCP Prompt Templates: Prompt templates that help AI better use MCP tools
MCP Resources: Static content that MCP services can access
MCP Server Architecture

DeepCore's MCP implementation uses SSE as the transport layer, based on the following architecture:
AI model/client initiates SSE connection to the MCP server endpoint
Client can request tool lists, call tools, get prompts, access resources, etc.
MCP server processes requests and returns responses via SSE stream
All interactions use standardized MCP formats
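The standardized format referred to above is JSON-RPC 2.0, per the MCP specification. As a rough sketch (the helper below is illustrative and not part of DeepCore; the method names follow the MCP specification), a client's requests over the connection look like this:

```python
import json
from typing import Optional

def make_mcp_request(request_id: int, method: str, params: Optional[dict] = None) -> str:
    """Build a JSON-RPC 2.0 request of the kind MCP uses (illustrative helper)."""
    message = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        message["params"] = params
    return json.dumps(message)

# A client typically starts by listing tools, then calling one:
list_request = make_mcp_request(1, "tools/list")
call_request = make_mcp_request(
    2, "tools/call",
    {"name": "query-weather", "arguments": {"city": "New York"}}
)
print(list_request)
print(call_request)
```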
Token Authentication
DeepCore uses API Token authentication for accessing all MCP-related APIs. This method is simple to implement and suitable for both development and production environments.
For detailed information on authentication methods, obtaining and using API Tokens, please refer to the DeepCore API Documentation.
import requests

# Use an API Token to call the API
headers = {"X-API-Token": "tk_your_api_token"}  # Replace with your API Token from the DeepCore platform

# Example: Create an MCP service
response = requests.post(
    "https://api.deepcore.top/api/mcp/create",
    headers=headers,
    json={
        "mcp_name": "my-service",
        "tool_ids": ["00000000-0000-0000-0000-000000000001"],
        "description": "My MCP Service"
    }
)
print(response.json())
Prerequisites
A registered DeepCore account and an API token
One or more tools created or accessible (via the /api/tools endpoint)
Importing MCP Service Tools
DeepCore provides API interfaces that allow users to import tools from existing MCP services. This makes it easy to integrate functionality from third-party MCP services into your own DeepCore tool set for use in custom AI applications.
Process for Importing MCP Tools
Parse the MCP service URL: Use the /api/tools/parse-mcp endpoint to analyze an MCP service and obtain its tool definitions
Batch create tools: Use the /api/tools/create-batch endpoint to batch import the parsed tools into your account
Example: Parsing MCP Services
The following example shows how to parse an existing MCP service and get its tool definitions:
import requests

# Use API Token for authentication
api_token = "tk_your_api_token"  # Replace with your actual API Token
headers = {"X-API-Token": api_token}

# Parse the MCP service URL
response = requests.post(
    "https://api.deepcore.top/api/tools/parse-mcp",
    headers=headers,
    json={"mcp_url": "https://api.deepcore.top/mcp/weather-service"}
)

# Get the parsing results
parse_result = response.json()["data"]
print(f"Successfully parsed MCP service: {parse_result['url']}")
print(f"Number of tools discovered: {len(parse_result['api_info'])}")

# Display the parsed tools
for i, tool in enumerate(parse_result['api_info']):
    print(f"Tool {i+1}: {tool['name']} - {tool.get('description', 'No description')}")
Example: Batch Importing Tools
After obtaining tool definitions by parsing an MCP service, you can use the following code to batch import these tools into your account:
import requests

# Use API Token for authentication
api_token = "tk_your_api_token"  # Replace with your actual API Token
headers = {"X-API-Token": api_token}

# Assuming parse_result has been obtained (result from the previous step)
# Build the batch tool creation request
tools_to_create = []
for tool_info in parse_result['api_info']:
    # Build the import configuration for each tool
    tool_config = {
        "name": f"MCP Import: {tool_info['name']}",
        "description": tool_info.get('description', 'Tool imported from MCP service'),
        "origin": parse_result['url'],  # MCP service URL as the tool source
        "path": tool_info.get('path', ''),
        "method": tool_info.get('method', 'GET'),
        "parameters": tool_info.get('parameters', []),
        "auth_config": None,  # No extra authentication needed for calls via MCP
        "is_mcp_tool": True  # Mark as an MCP tool
    }
    tools_to_create.append(tool_config)

# Batch create the tools
response = requests.post(
    "https://api.deepcore.top/api/tools/create-batch",
    headers=headers,
    json={"tools": tools_to_create}
)

# Check the results
result = response.json()
if result.get("code") == 0:
    created_tools = result["data"]
    print(f"Successfully imported {len(created_tools)} tools")
    for tool in created_tools:
        print(f"- Tool ID: {tool['id']}, Name: {tool['name']}")
else:
    print(f"Failed to import tools: {result.get('msg', 'Unknown error')}")
Benefits of Importing MCP Tools
Functionality Reuse: No need to redevelop existing MCP tool functionality
Service Integration: Combine tools from multiple MCP services into one custom MCP service
Feature Customization: Selectively import only the tools you need and skip the rest
Permission Control: Apply your own access control policies to imported tools
Better Integration: Seamlessly integrate third-party MCP service tools with your own developed tools
Creating MCP Services
Steps to Create an MCP Service
Select tools: Determine which tools to expose in the MCP service
Create the MCP server: Call the /api/mcp/create API to create an MCP server
Add prompt templates (optional): Add prompt templates via /api/mcp/{mcp_name}/prompts
Add resources (optional): Add resources via /api/mcp/{mcp_name}/resources
Getting Tool IDs
When creating an MCP service, you need to provide a list of tool IDs to include. These IDs must be existing tool IDs in the DeepCore platform. Here's how to get available tool IDs:
import requests

# Use API Token for authentication
api_token = "tk_your_api_token"  # Replace with your actual API Token
headers = {"X-API-Token": api_token}

# Get the list of available tools
response = requests.get(
    "https://api.deepcore.top/api/tools/list",
    headers=headers,
    params={
        "include_public": True,  # Include public tools
        "page": 1,  # Page number
        "page_size": 50  # Number of tools per page
    }
)

# Parse the tool list
tools_data = response.json()["data"]
available_tools = tools_data["items"]

# Display the available tools and their IDs
print("Available tools list:")
for tool in available_tools:
    print(f"ID: {tool['id']} - Name: {tool['name']} - Description: {tool.get('description', 'No description')}")

# Select the tool IDs to include in the MCP service
selected_tool_ids = [
    "00000000-0000-0000-0000-000000000001",  # Example ID, replace with an actual tool ID
    "00000000-0000-0000-0000-000000000002"  # Example ID, replace with an actual tool ID
]
Example: Creating an MCP Service
Here's a complete example showing how to create an MCP service using an API Token:
import requests

# Use API Token
api_token = "tk_your_api_token"  # Replace with your actual API Token
headers = {"X-API-Token": api_token}

# Step 1: Get the list of available tools
response = requests.get(
    "https://api.deepcore.top/api/tools/list",
    headers=headers,
    params={"include_public": True, "page": 1, "page_size": 10}
)
tools_data = response.json()["data"]["items"]
available_tool_ids = [tool["id"] for tool in tools_data]
print(f"Available tool IDs: {available_tool_ids}")

# Step 2: Create the MCP service
response = requests.post(
    "https://api.deepcore.top/api/mcp/create",
    headers=headers,
    json={
        "mcp_name": "sample-service",
        "tool_ids": available_tool_ids[:2],  # Use the first two tools
        "description": "MCP Service Example Created with API Token"
    }
)
mcp_data = response.json()["data"]
print(f"MCP service created: {mcp_data['mcp_name']}")

# Step 3: Add a prompt template
response = requests.post(
    f"https://api.deepcore.top/api/mcp/{mcp_data['mcp_name']}/prompts",
    headers=headers,
    json={
        "prompt_name": "sample-prompt",
        "description": "Example prompt template",
        "arguments": [
            {"name": "param1", "description": "Parameter 1", "required": True}
        ],
        "template": "Use {{ param1 }} as a parameter"
    }
)
print(f"Prompt template added: {response.json()}")

# Step 4: Add a resource
response = requests.post(
    f"https://api.deepcore.top/api/mcp/{mcp_data['mcp_name']}/resources",
    headers=headers,
    json={
        "resource_uri": "sample.txt",
        "content": "This is an example resource content",
        "mime_type": "text/plain"
    }
)
print(f"Resource added: {response.json()}")
Example: Adding Prompt Templates
response = requests.post(
    "https://api.deepcore.top/api/mcp/weather-service/prompts",
    headers=headers,
    json={
        "prompt_name": "check-weather",
        "description": "Query weather information for a specified city",
        "arguments": [
            {"name": "city", "description": "City name", "required": True},
            {"name": "date", "description": "Date (optional)", "required": False}
        ],
        "template": "Help me check the weather in {{ city }}{% if date %}, the date is {{ date }}{% endif %}"
    }
)
print(response.json())
Example: Adding Resources
response = requests.post(
    "https://api.deepcore.top/api/mcp/weather-service/resources",
    headers=headers,
    json={
        "resource_uri": "cities.txt",
        "content": "New York\nLos Angeles\nChicago\nPhiladelphia\n...",
        "mime_type": "text/plain"
    }
)
print(response.json())
MCP Server Internal Implementation
DeepCore's MCP server internal implementation is based on the Python MCP server library and integrated with FastAPI. Main components include:
SseServerTransport: Provides an SSE-based transport layer
Server: MCP server core class, handles requests and responses
Handlers: Registered handler functions, including:
list_tools - List available tools
call_tool - Call tools
list_prompts - List prompt templates
get_prompt - Get a specific prompt template
list_resources - List resources
read_resource - Read resource content
Core Implementation
DeepCore's MCP implementation encapsulates tools into MCP tool format, automatically converting parameter structures and response formats to ensure compliance with MCP specifications. Each MCP service is stored in the database, including basic service information, associated tools, prompts, and resources.
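DeepCore's internal conversion logic is not shown here, but a simplified sketch of turning a flat parameter list (of the shape used in the import examples above) into an MCP-style JSON Schema might look like the following. The field names are assumptions based on those examples:

```python
def to_mcp_input_schema(parameters: list) -> dict:
    """Convert a flat parameter list into an MCP-style input schema (illustrative)."""
    properties = {}
    required = []
    for param in parameters:
        properties[param["name"]] = {
            "type": param.get("type", "string"),
            "description": param.get("description", ""),
        }
        if param.get("required", False):
            required.append(param["name"])
    return {"type": "object", "properties": properties, "required": required}

schema = to_mcp_input_schema([
    {"name": "city", "type": "string", "description": "City name", "required": True},
    {"name": "date", "type": "string", "description": "Date (optional)"},
])
print(schema)
```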
Dynamic MCP service creation process:
Create an MCPServer record in the database
Associate MCPTool records pointing to existing tools
Optionally add MCPPrompt and MCPResource records
Process requests through routing, dynamically creating MCP server instances
Using MCP Services
Client Connection
AI models or clients can connect to MCP server endpoints via SSE:
GET https://api.deepcore.top/mcp/{mcp_name}
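The server replies over this connection as a stream of SSE events. As an illustrative sketch (the event names and payloads below are examples, not captured server output), parsing such a stream might look like:

```python
def parse_sse_events(raw: str):
    """Split a raw SSE stream into {event, data} records. Illustrative only."""
    events = []
    for block in raw.strip().split("\n\n"):
        event = {"event": "message", "data": ""}
        for line in block.splitlines():
            if line.startswith("event:"):
                event["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                event["data"] += line[len("data:"):].strip()
        events.append(event)
    return events

# Example stream: an endpoint event followed by a JSON-RPC message
sample = (
    "event: endpoint\n"
    "data: /mcp/weather-service/messages\n"
    "\n"
    "event: message\n"
    "data: {\"jsonrpc\": \"2.0\", \"id\": 1}\n"
)
for evt in parse_sse_events(sample):
    print(evt["event"], "->", evt["data"])
```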
Available Operations
Once connected, clients can perform the following operations:
List tools: Get list of available tools
Call tools: Execute tool operations and get results
List prompts: Get list of available prompt templates
Get prompts: Get specific prompt templates and fill variables
List resources: Get list of available resources
Read resources: Get resource content
Example: AI Model Using MCP Service
When an AI model (like Claude) connects to an MCP service, it can:
Model: I need to know the weather in New York
MCP Service: [Lists tools, including a weather query tool]
Model: [Calls the weather query tool with parameter city="New York"]
MCP Service: [Returns weather information for New York]
Model: Thank you, this information is very helpful
Using Python Client to Connect to MCP Services
DeepCore MCP services can be connected to and used through various client libraries. Below is how to use the official mcp library together with the mirascope library to connect to DeepCore-provided MCP services.
Installing Dependencies
First, install the necessary dependency packages:
pip install mcp mirascope
Example of Using Mirascope to Connect to MCP Services
Here's a basic example showing how to use the mirascope library to connect to a DeepCore MCP service and list available tools:
import asyncio

from mcp import ListToolsResult
from mirascope.mcp import sse_client

async def main():
    # Connect to the DeepCore MCP service
    async with sse_client("https://api.deepcore.top/mcp/mcpAggService") as client:
        # List available tools
        tools: ListToolsResult = await client._session.list_tools()
        print(tools)

if __name__ == "__main__":
    asyncio.run(main())
Example: Calling MCP Tools
The following example shows how to call tools in an MCP service:
import asyncio

from mcp import CallToolResult
from mirascope.mcp import sse_client

async def main():
    async with sse_client("https://api.deepcore.top/mcp/weather-service") as client:
        # First list all tools
        tools = await client._session.list_tools()
        print(f"Available tools: {[tool.name for tool in tools.tools]}")

        # Call the weather query tool
        tool_result: CallToolResult = await client._session.call_tool(
            name="query-weather",
            arguments={"city": "New York", "date": "today"}
        )

        # Print the tool execution results
        for content in tool_result.content:
            if content.type == "text":
                print(f"Result: {content.text}")
            elif content.type == "image":
                print(f"Image received ({content.mimeType})")

if __name__ == "__main__":
    asyncio.run(main())
Example: Listing and Using Prompt Templates
import asyncio

from mirascope.mcp import sse_client

async def main():
    async with sse_client("https://api.deepcore.top/mcp/weather-service") as client:
        # List all prompt templates
        prompts = await client._session.list_prompts()
        print(f"Available prompts: {[prompt.name for prompt in prompts.prompts]}")

        # Get a specific prompt
        if prompts.prompts:
            prompt_name = prompts.prompts[0].name
            prompt = await client._session.get_prompt(
                name=prompt_name,
                arguments={"city": "Shanghai", "date": "tomorrow"}
            )
            for message in prompt.messages:
                print(f"{message.role}: {message.content}")

if __name__ == "__main__":
    asyncio.run(main())
Example: Reading Resources
import asyncio

from mirascope.mcp import sse_client

async def main():
    async with sse_client("https://api.deepcore.top/mcp/weather-service") as client:
        # List all resources
        resources = await client._session.list_resources()
        print(f"Available resources: {[resource.uri for resource in resources.resources]}")

        # Read a specific resource
        if resources.resources:
            resource_uri = resources.resources[0].uri
            resource = await client._session.read_resource(uri=resource_uri)
            content = resource.contents[0]
            print(f"Resource content: {content.text[:100]}...")  # Only display the first 100 characters

if __name__ == "__main__":
    asyncio.run(main())
Multilingual SDK Support
The Model Context Protocol provides SDKs in multiple programming languages that integrate seamlessly with DeepCore MCP services. The sections below show how to use each SDK, so developers can work with MCP services in the language they know best.
TypeScript/JavaScript SDK
The TypeScript SDK is one of the official implementations of the Model Context Protocol, suitable for Node.js and browser environments.
Installation
npm install @modelcontextprotocol/sdk
Usage Example
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';

async function main() {
  // Create the MCP client and connect over SSE
  const client = new Client({ name: 'deepcore-example', version: '1.0.0' });
  const transport = new SSEClientTransport(new URL('https://api.deepcore.top/mcp/mcpAggService'));
  await client.connect(transport);

  try {
    // Get the list of available tools
    const tools = await client.listTools();
    console.log('Available tools:', tools.tools.map((tool) => tool.name));

    if (tools.tools.length > 0) {
      // Call the first tool
      const result = await client.callTool({
        name: tools.tools[0].name,
        arguments: {
          // Tool parameters
          param1: 'value1',
          param2: 'value2',
        },
      });

      // Process the results
      for (const content of result.content) {
        if (content.type === 'text') {
          console.log('Result:', content.text);
        }
      }
    }
  } finally {
    // Close the client connection
    await client.close();
  }
}

main().catch(console.error);
For more details and advanced usage, please refer to the TypeScript SDK Official Repository.
Java SDK
The Java SDK provides MCP functionality for Java applications and integrates well with Spring AI.
Adding Dependencies
<!-- Maven -->
<dependency>
    <groupId>io.modelcontextprotocol</groupId>
    <artifactId>mcp-java-sdk</artifactId>
    <version>latest-version</version>
</dependency>
Or
// Gradle
implementation 'io.modelcontextprotocol:mcp-java-sdk:latest-version'
Usage Example
import io.modelcontextprotocol.client.MCPClient;
import io.modelcontextprotocol.client.MCPClientBuilder;
import io.modelcontextprotocol.types.CallToolResult;
import io.modelcontextprotocol.types.Content;
import io.modelcontextprotocol.types.ListToolsResult;
import io.modelcontextprotocol.types.Tool;

import java.util.HashMap;
import java.util.Map;

public class MCPExample {
    public static void main(String[] args) throws Exception {
        // Create the MCP client
        try (MCPClient client = MCPClientBuilder.create("https://api.deepcore.top/mcp/mcpAggService").build()) {
            // Get the list of available tools
            ListToolsResult toolsResult = client.listTools();
            System.out.println("Available tools:");
            for (Tool tool : toolsResult.getTools()) {
                System.out.println("- " + tool.getName() + ": " + tool.getDescription());
            }

            if (!toolsResult.getTools().isEmpty()) {
                // Call the first tool
                String toolName = toolsResult.getTools().get(0).getName();
                Map<String, Object> arguments = new HashMap<>();
                arguments.put("param1", "value1");
                arguments.put("param2", "value2");

                CallToolResult result = client.callTool(toolName, arguments);

                // Process the results
                for (Content content : result.getContents()) {
                    if ("text".equals(content.getType())) {
                        System.out.println("Result: " + content.getText());
                    }
                }
            }
        }
    }
}
For more details and advanced usage, please refer to the Java SDK Official Repository.
Advantages of MCP Services
Using DeepCore MCP services offers the following advantages:
Standardized Interface: Use a unified protocol to connect different data sources
Secure Access: Controlled data access methods
Context Preservation: AI models can maintain context across different tools
Simplified Integration: No need to maintain separate connectors for each data source
Scalability: Easily add new tools and data sources
Practical Application Scenarios
1. Knowledge Base Access
Create MCP services connected to enterprise knowledge bases, allowing AI models to query and retrieve specific information.
2. Data Analysis Tools
Encapsulate data analysis functionality as MCP tools, enabling AI models to execute data queries and analysis.
3. Internal System Integration
Package internal system APIs as MCP services, giving AI assistants access to these functionalities.
4. Specialized Domain Tools
Create collections of specialized domain tools (such as finance, healthcare) and provide them to AI models through MCP.
Best Practices
Tool Design: Design simple, single-function tools rather than complex multi-function tools
Provide Clear Descriptions: Give clear descriptions for tools and parameters
Add Prompt Templates: Help AI models better understand and use tools
Resource Management: Provide commonly used information as resources to reduce repeated queries
Permission Control: Pay attention to access permission control for MCP services
Limitations and Considerations
MCP services currently only support synchronous calls, not long-running asynchronous operations
There are size limitations on data returned by tools; avoid returning overly large datasets
Be careful to protect sensitive information and authentication credentials
MCP server instances are created per request and do not maintain long-term state
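Given the size limits noted above, one pragmatic approach is to truncate oversized tool output before returning it. The helper below is a hypothetical sketch, not part of DeepCore:

```python
def truncate_result(text: str, max_chars: int = 4000) -> str:
    """Truncate oversized tool output, noting how much was dropped (illustrative)."""
    if len(text) <= max_chars:
        return text
    dropped = len(text) - max_chars
    return text[:max_chars] + f"\n... [truncated {dropped} characters]"

# Short output passes through unchanged; long output is cut with a marker
print(truncate_result("small result"))
print(truncate_result("x" * 5000, max_chars=100)[-40:])
```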
Troubleshooting
Common issues and solutions:
Connection Problems: Check if the MCP service name is correct, confirm the service has been created
Authentication Errors: Verify user permissions and authentication tokens
Tool Call Failures: Check tool configuration and parameter formats
Resource Access Failures: Confirm resources have been added and are in the correct format
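When debugging these failures, it helps to fail loudly on the API's response envelope rather than silently reading missing fields. The helper below is a sketch that assumes the {"code": 0, "data": ..., "msg": ...} envelope shown in the examples in this document:

```python
def check_deepcore_response(payload: dict):
    """Return the data field of a DeepCore API response, or raise a descriptive
    error (illustrative; assumes the code/data/msg envelope used in this doc)."""
    if payload.get("code") != 0:
        raise RuntimeError(
            f"DeepCore API error {payload.get('code')}: {payload.get('msg', 'Unknown error')}"
        )
    return payload["data"]

# Successful responses unwrap cleanly; failures raise with the server's message
data = check_deepcore_response({"code": 0, "data": {"mcp_name": "sample-service"}})
print(data)
```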
Summary
DeepCore MCP services provide a standardized way for AI models to securely and effectively access and operate various tools and data sources. Through simple API calls, developers can create powerful MCP services, extending the capability range of AI models and building smarter, more practical AI applications.