
DeepCore MCP Documentation

What is MCP

MCP (Model Context Protocol) is an open standard for building secure, two-way connections between AI models and external data sources and tools. Through MCP, developers can:

  • Expose their data to AI models via MCP servers

  • Build AI applications (MCP clients) that connect to these servers

  • Enable AI models to maintain context across different tools and datasets

MCP addresses the challenges of AI systems accessing data, replacing traditional fragmented integration approaches with a universal open standard.

MCP Implementation in DeepCore

DeepCore implements complete MCP server functionality, allowing users to package their tools and data as MCP services for AI models to use. In DeepCore, MCP services fall into three types:

  1. Built-in MCP services: System-predefined services such as coin-api

  2. MCP Tool Importing: Support for importing tools from external MCP services into the DeepCore tool marketplace, enabling easy integration of third-party capabilities

  3. Dynamic MCP services: Custom MCP services created by users through the API

Core Components

DeepCore's MCP implementation contains the following core components:

  • MCP Server: Provides the ability to interact with AI models through SSE (Server-Sent Events)

  • MCP Tools: Functional units encapsulated in the MCP server, typically representing API calls

  • MCP Prompt Templates: Prompt templates that help AI better use MCP tools

  • MCP Resources: Static content that MCP services can access

MCP Server Architecture

DeepCore's MCP implementation uses SSE as the transport layer, based on the following architecture:

  1. The AI model or client initiates an SSE connection to the MCP server endpoint

  2. Client can request tool lists, call tools, get prompts, access resources, etc.

  3. MCP server processes requests and returns responses via SSE stream

  4. All interactions use standardized MCP formats

Token Authentication

DeepCore uses API Token authentication for accessing all MCP-related APIs. This method is simple to implement and suitable for both development and production environments.

For detailed information on authentication methods and on obtaining and using API Tokens, please refer to the DeepCore API Documentation.

Prerequisites

  • Registered DeepCore account and obtained API token

  • Created tools or have access to existing ones (via the /api/tools endpoint)

Importing MCP Service Tools

DeepCore provides API interfaces that allow users to import tools from existing MCP services. This makes it easy to integrate functionality from third-party MCP services into your own DeepCore tool set for use in custom AI applications.

Process for Importing MCP Tools

  1. Parse MCP service URL: Use the /api/tools/parse-mcp interface to analyze an MCP service and obtain its tool definitions

  2. Batch create tools: Use the /api/tools/create-batch interface to batch import the parsed tools into your account

Example: Parsing MCP Services

The following example shows how to parse an existing MCP service and get its tool definitions:
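A minimal sketch of the parse step, assuming Bearer-token authentication and a `url` field in the request body (the exact field names and host are placeholders; check the API reference for the real schema):

```python
import requests

BASE_URL = "https://your-deepcore-host"  # placeholder host
API_TOKEN = "your-api-token"             # placeholder token

def auth_headers(token: str) -> dict:
    # Assumed header scheme; adjust to match your deployment.
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

def parse_mcp_service(mcp_service_url: str) -> list:
    """Ask DeepCore to analyze an external MCP service and return its tool definitions."""
    resp = requests.post(
        f"{BASE_URL}/api/tools/parse-mcp",
        headers=auth_headers(API_TOKEN),
        json={"url": mcp_service_url},  # assumed request field
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # The target URL below is hypothetical.
    tools = parse_mcp_service("https://example.com/mcp/some-service/sse")
    for tool in tools:
        print(tool.get("name"), "-", tool.get("description"))
```

The returned tool definitions can then be fed to the batch-create step below.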

Example: Batch Importing Tools

After obtaining tool definitions by parsing an MCP service, you can use the following code to batch import these tools into your account:
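A sketch of the batch-import step, again with an assumed payload shape. The `select_tools` helper reflects the selective-import idea: keep only the definitions you actually want before creating them:

```python
import requests

BASE_URL = "https://your-deepcore-host"  # placeholder host
API_TOKEN = "your-api-token"             # placeholder token

def select_tools(definitions: list, wanted_names: list) -> list:
    """Keep only the parsed tool definitions whose names you want to import."""
    wanted = set(wanted_names)
    return [d for d in definitions if d.get("name") in wanted]

def batch_create_tools(tool_definitions: list) -> dict:
    """Import a list of parsed tool definitions into your DeepCore account."""
    resp = requests.post(
        f"{BASE_URL}/api/tools/create-batch",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"tools": tool_definitions},  # assumed payload shape
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # 'parsed' would come from the parse-mcp step; shown inline for illustration.
    parsed = [{"name": "get_price"}, {"name": "get_news"}]
    result = batch_create_tools(select_tools(parsed, ["get_price"]))
    print(result)
```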

Benefits of Importing MCP Tools

  1. Functionality Reuse: No need to redevelop existing MCP tool functionality

  2. Service Integration: Combine tools from multiple MCP services into one custom MCP service

  3. Feature Customization: Selectively import only the tools you need and skip the rest

  4. Permission Control: Apply your own access control policies to imported tools

  5. Better Integration: Seamlessly integrate third-party MCP service tools with your own developed tools

Creating MCP Services

Steps to Create an MCP Service

  1. Select tools: Determine which tools to expose in the MCP service

  2. Create MCP server: Call the /api/mcp/create API to create an MCP server

  3. Add prompt templates (optional): Add prompt templates via /api/mcp/{mcp_name}/prompts

  4. Add resources (optional): Add resources via /api/mcp/{mcp_name}/resources

Getting Tool IDs

When creating an MCP service, you need to provide a list of tool IDs to include. These IDs must be existing tool IDs in the DeepCore platform. Here's how to get available tool IDs:
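A sketch of listing tools and collecting their IDs, assuming the endpoint returns a JSON array of tool objects each carrying an `id` and `name` field (field names are assumptions):

```python
import requests

BASE_URL = "https://your-deepcore-host"  # placeholder host
API_TOKEN = "your-api-token"             # placeholder token

def fetch_tools() -> list:
    """List the tools visible to your account via the /api/tools endpoint."""
    resp = requests.get(
        f"{BASE_URL}/api/tools",
        headers={"Authorization": f"Bearer {API_TOKEN}"},  # assumed auth scheme
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def extract_tool_ids(tools: list) -> list:
    """Pull the IDs out of a tool list; assumes each tool dict carries an 'id' field."""
    return [t["id"] for t in tools if "id" in t]

if __name__ == "__main__":
    for tool in fetch_tools():
        print(tool.get("id"), tool.get("name"))
```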

Example: Creating an MCP Service

Here's a complete example showing how to create an MCP service using an API Token:
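The sketch below assumes Bearer-token auth and illustrative request fields (`mcp_name`, `tool_ids`); consult the API reference for the exact schema:

```python
import requests

BASE_URL = "https://your-deepcore-host"  # placeholder host
API_TOKEN = "your-api-token"             # placeholder token

def build_create_payload(mcp_name: str, tool_ids: list) -> dict:
    """Build the creation request body; field names here are illustrative."""
    if not tool_ids:
        raise ValueError("an MCP service needs at least one tool")
    return {"mcp_name": mcp_name, "tool_ids": tool_ids}

def create_mcp_service(mcp_name: str, tool_ids: list) -> dict:
    """Create an MCP server exposing the given existing tool IDs."""
    resp = requests.post(
        f"{BASE_URL}/api/mcp/create",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json=build_create_payload(mcp_name, tool_ids),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Tool IDs below are hypothetical; use IDs returned by /api/tools.
    service = create_mcp_service("my-coin-service", ["tool-id-1", "tool-id-2"])
    print(service)
```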

Example: Adding Prompt Templates
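A sketch of attaching a prompt template via /api/mcp/{mcp_name}/prompts. The request fields (`name`, `description`, `template`) and the `{symbol}` variable syntax are assumptions; the `fill_template` helper just previews how a template renders once its variables are filled:

```python
import requests

BASE_URL = "https://your-deepcore-host"  # placeholder host
API_TOKEN = "your-api-token"             # placeholder token

def add_prompt(mcp_name: str, prompt_name: str, template: str, description: str = "") -> dict:
    """Attach a prompt template to an existing MCP service."""
    resp = requests.post(
        f"{BASE_URL}/api/mcp/{mcp_name}/prompts",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"name": prompt_name, "description": description, "template": template},  # assumed fields
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def fill_template(template: str, variables: dict) -> str:
    """Client-side preview of a template with its variables substituted."""
    return template.format(**variables)

if __name__ == "__main__":
    # Service and prompt names below are hypothetical.
    add_prompt(
        "my-coin-service",
        "price-lookup",
        "Use the get_price tool to fetch the current price of {symbol}.",
        description="Guides the model toward the price tool",
    )
```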

Example: Adding Resources
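A sketch of attaching a static resource via /api/mcp/{mcp_name}/resources; the request fields (`uri`, `content`, `mime_type`) are assumptions about the schema:

```python
import requests

BASE_URL = "https://your-deepcore-host"  # placeholder host
API_TOKEN = "your-api-token"             # placeholder token

def resources_endpoint(base_url: str, mcp_name: str) -> str:
    """Build the resources endpoint URL for a given MCP service."""
    return f"{base_url}/api/mcp/{mcp_name}/resources"

def add_resource(mcp_name: str, uri: str, content: str, mime_type: str = "text/plain") -> dict:
    """Attach a static resource to an existing MCP service."""
    resp = requests.post(
        resources_endpoint(BASE_URL, mcp_name),
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"uri": uri, "content": content, "mime_type": mime_type},  # assumed fields
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # URI scheme and content below are hypothetical.
    add_resource("my-coin-service", "docs://faq", "Frequently asked questions...")
```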

MCP Server Internal Implementation

DeepCore's MCP server internal implementation is based on the Python MCP server library and integrated with FastAPI. Main components include:

  1. SseServerTransport: Provides an SSE-based transport layer

  2. Server: MCP server core class, handles requests and responses

  3. Handlers: Registered handler functions, including:

    • list_tools - List available tools

    • call_tool - Call tools

    • list_prompts - List prompt templates

    • get_prompt - Get specific prompts

    • list_resources - List resources

    • read_resource - Read resource content
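The handler registration above can be illustrated with a simplified, dependency-free sketch of the dispatch pattern. The real implementation registers these handlers through the `mcp` library's `Server` class decorators; the `get_price` tool and its `symbol` argument below are hypothetical:

```python
import asyncio

# Registry mapping MCP method names to handler coroutines.
HANDLERS: dict = {}

def handler(method: str):
    """Register a coroutine as the handler for an MCP method name."""
    def decorate(fn):
        HANDLERS[method] = fn
        return fn
    return decorate

@handler("list_tools")
async def list_tools() -> list:
    # In DeepCore this list is built from the tools associated with the service.
    return [{"name": "get_price", "description": "Fetch a coin price (illustrative)"}]

@handler("call_tool")
async def call_tool(name: str, arguments: dict) -> dict:
    # Tool results are returned as MCP content items.
    if name == "get_price":
        return {"content": [{"type": "text", "text": f"price for {arguments.get('symbol')}"}]}
    raise ValueError(f"unknown tool: {name}")

async def dispatch(method: str, *args, **kwargs):
    """Route an incoming MCP request to its registered handler."""
    if method not in HANDLERS:
        raise ValueError(f"unsupported MCP method: {method}")
    return await HANDLERS[method](*args, **kwargs)

if __name__ == "__main__":
    print(asyncio.run(dispatch("list_tools")))
    print(asyncio.run(dispatch("call_tool", "get_price", {"symbol": "BTC"})))
```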

Core Implementation

DeepCore's MCP implementation encapsulates tools into MCP tool format, automatically converting parameter structures and response formats to ensure compliance with MCP specifications. Each MCP service is stored in the database, including basic service information, associated tools, prompts, and resources.

Dynamic MCP service creation process:

  1. Create MCPServer record in the database

  2. Associate MCPTool records pointing to existing tools

  3. Optionally add MCPPrompt and MCPResource records

  4. Process requests through routing, dynamically creating MCP server instances
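The record structure behind steps 1–3 can be sketched with plain dataclasses. The class names come from the text above; the field names are illustrative, not the actual database schema:

```python
from dataclasses import dataclass, field

@dataclass
class MCPTool:
    tool_id: str  # points at an existing DeepCore tool record

@dataclass
class MCPPrompt:
    name: str
    template: str

@dataclass
class MCPResource:
    uri: str
    content: str

@dataclass
class MCPServer:
    name: str
    tools: list = field(default_factory=list)       # MCPTool records
    prompts: list = field(default_factory=list)     # optional MCPPrompt records
    resources: list = field(default_factory=list)   # optional MCPResource records

def create_mcp_server(name: str, tool_ids: list) -> MCPServer:
    """Steps 1-2: create the server record and associate tool records."""
    server = MCPServer(name=name)
    server.tools = [MCPTool(tool_id=t) for t in tool_ids]
    return server

if __name__ == "__main__":
    server = create_mcp_server("my-coin-service", ["tool-id-1", "tool-id-2"])
    # Step 3 (optional): attach prompts and resources.
    server.prompts.append(MCPPrompt("price-lookup", "Fetch the price of {symbol}."))
    print(server)
```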

Using MCP Services

Client Connection

AI models or clients can connect to MCP server endpoints via SSE:
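The endpoint shape below is illustrative; substitute your deployment's host and the service name chosen at creation time:

```
GET https://your-deepcore-host/mcp/{mcp_name}/sse
Accept: text/event-stream
Authorization: Bearer <your-api-token>
```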

Available Operations

Once connected, clients can perform the following operations:

  1. List tools: Get list of available tools

  2. Call tools: Execute tool operations and get results

  3. List prompts: Get list of available prompt templates

  4. Get prompts: Get specific prompt templates and fill variables

  5. List resources: Get list of available resources

  6. Read resources: Get resource content

Example: AI Model Using MCP Service

When an AI model (like Claude) connects to an MCP service, it can discover the available tools, call them with structured arguments, retrieve prompt templates, and read resources, using the results to ground its responses.

Using Python Client to Connect to MCP Services

DeepCore MCP services can be accessed through various client libraries. Below are examples of using the official mcp library and the mirascope library to connect to DeepCore-provided MCP services.

Installing Dependencies

First, install the necessary dependency packages:
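Both packages are published on PyPI under the names used in this document:

```shell
pip install mcp mirascope
```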

Example of Using Mirascope to Connect to MCP Services

Here's a basic example showing how to use the mirascope library to connect to a DeepCore MCP service and list available tools:

Example: Calling MCP Tools

The following example shows how to call tools in an MCP service:

Example: Listing and Using Prompt Templates

Example: Reading Resources

Multilingual SDK Support

The Model Context Protocol provides SDKs in multiple programming languages that integrate seamlessly with DeepCore MCP services. The sections below cover each language SDK, so developers can work with MCP services in the language they know best.

TypeScript/JavaScript SDK

The TypeScript SDK is one of the official implementations of the Model Context Protocol, suitable for Node.js and browser environments.

Installation

Usage Example

For more details and advanced usage, please refer to the TypeScript SDK Official Repository.

Java SDK

The Java SDK provides functionality for using MCP in Java applications, with good integration with Spring AI.

Adding Dependencies

Or

Usage Example

For more details and advanced usage, please refer to the Java SDK Official Repository.

Advantages of MCP Services

Using DeepCore MCP services offers the following advantages:

  1. Standardized Interface: Use a unified protocol to connect different data sources

  2. Secure Access: Controlled data access methods

  3. Context Preservation: AI models can maintain context across different tools

  4. Simplified Integration: No need to maintain separate connectors for each data source

  5. Scalability: Easily add new tools and data sources

Practical Application Scenarios

1. Knowledge Base Access

Create MCP services connected to enterprise knowledge bases, allowing AI models to query and retrieve specific information.

2. Data Analysis Tools

Encapsulate data analysis functionality as MCP tools, enabling AI models to execute data queries and analysis.

3. Internal System Integration

Package internal system APIs as MCP services, giving AI assistants access to these functionalities.

4. Specialized Domain Tools

Create collections of specialized domain tools (such as finance, healthcare) and provide them to AI models through MCP.

Best Practices

  1. Tool Design: Design simple, single-function tools rather than complex multi-function tools

  2. Provide Clear Descriptions: Give clear descriptions for tools and parameters

  3. Add Prompt Templates: Help AI models better understand and use tools

  4. Resource Management: Provide commonly used information as resources to reduce repeated queries

  5. Permission Control: Pay attention to access permission control for MCP services

Limitations and Considerations

  1. MCP services currently only support synchronous calls, not long-running asynchronous operations

  2. There are size limitations on data returned by tools; avoid returning overly large datasets

  3. Be careful to protect sensitive information and authentication credentials

  4. MCP server instances are created per request and do not maintain long-term state

Troubleshooting

Common issues and solutions:

  1. Connection Problems: Check if the MCP service name is correct, confirm the service has been created

  2. Authentication Errors: Verify user permissions and authentication tokens

  3. Tool Call Failures: Check tool configuration and parameter formats

  4. Resource Access Failures: Confirm resources have been added and are in the correct format

Summary

DeepCore MCP services provide a standardized way for AI models to securely and effectively access and operate various tools and data sources. Through simple API calls, developers can create powerful MCP services, extending the capability range of AI models and building smarter, more practical AI applications.
