MCP Server Development Protocol
This protocol is designed to streamline the process of building MCP servers with Cline.
🚀 Build and share your MCP servers with the world. Once you’ve created a great MCP server, submit it to the Cline MCP Marketplace to make it discoverable and one-click installable by thousands of developers.
What Are MCP Servers?
Model Context Protocol (MCP) servers extend AI assistants like Cline by giving them the ability to:
- Access external APIs and services
- Retrieve real-time data
- Control applications and local systems
- Perform actions beyond what text prompts alone can achieve
Without MCP, AI assistants are powerful but isolated. With MCP, they gain the ability to interact with virtually any digital system.
The Development Protocol
The heart of effective MCP server development is following a structured protocol. This protocol is implemented through a .clinerules file that lives at the root of your MCP working directory (/Users/your-name/Documents/Cline/MCP).
Using .clinerules Files
A .clinerules file is a special configuration file that Cline reads automatically when working in the directory where it’s placed. These files:
- Configure Cline’s behavior and enforce best practices
- Switch Cline into a specialized MCP development mode
- Provide a step-by-step protocol for building servers
- Implement safety measures like preventing premature completion
- Guide you through planning, implementation, and testing phases
Here’s the complete MCP Server Development Protocol that should be placed in your .clinerules file:
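The outline below is only an abbreviated sketch of the protocol’s shape, based on the behaviors described in the next list; the headings and wording are illustrative, not the canonical protocol text:

```markdown
# MCP Server Development Protocol

⚠️ Do not attempt completion until every tool has been tested.

## Step 1: Planning (PLAN MODE)
- Clarify the problem the server solves and the tools it needs
- Review the target API's documentation, authentication, and rate limits
- Design each tool's name, inputs, and output format

## Step 2: Implementation (ACT MODE)
- Bootstrap the project and implement the API client
- Add logging, typing, caching, and error handling
- Register the server in the MCP settings file

## Step 3: Testing (required)
- Invoke every tool with real inputs and verify the output

## Step 4: Completion
- Only complete the task after all tools have passed testing
```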
When this .clinerules file is present in your working directory, Cline will:
- Start in PLAN MODE to design your server before implementation
- Enforce proper implementation patterns in ACT MODE
- Require testing of all tools before allowing completion
- Guide you through the entire development lifecycle
Getting Started
Creating an MCP server takes just a few steps:
1. Create a .clinerules file (🚨 IMPORTANT)
First, add a .clinerules file to the root of your MCP working directory using the protocol above. This file configures Cline to use the MCP development protocol when working in this folder.
2. Start a Chat with a Clear Description
Begin your Cline chat by clearly describing what you want to build. Be specific about:
- The purpose of your MCP server
- Which API or service you want to integrate with
- Any specific tools or features you need
For example:
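A request along these lines (the service and tools named here are only an illustration) gives Cline enough to start planning:

```
I want to build an MCP server for the AlphaAdvantage stock API. It should
expose tools for getting a stock overview, technical indicators (RSI, MACD),
and recent earnings reports. I have a free-tier API key, which is limited
to 5 requests per minute.
```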
3. Work Through the Protocol
Cline will automatically start in PLAN MODE, guiding you through the planning process:
- Discussing the problem scope
- Reviewing API documentation
- Planning authentication methods
- Designing tool interfaces
When ready, switch to ACT MODE using the toggle at the bottom of the chat to begin implementation.
4. Provide API Documentation Early
One of the most effective ways to help Cline build your MCP server is to share official API documentation right at the start:
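For instance, you might paste the relevant excerpts directly into the chat (the endpoint details below are illustrative placeholders, not actual API documentation):

```
Here is the documentation for the endpoints we need:

GET /query?function=OVERVIEW&symbol={symbol}&apikey={key}
Returns company information and key ratios as JSON.

Authentication: pass the API key as the `apikey` query parameter.
Rate limit: 5 requests per minute on the free tier.
```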
Providing comprehensive API details (endpoints, authentication, data structures) significantly improves Cline’s ability to implement an effective MCP server.
Understanding the Two Modes
PLAN MODE
In this collaborative phase, you work with Cline to design your MCP server:
- Define the problem scope
- Choose appropriate APIs
- Plan authentication methods
- Design the tool interfaces
- Determine data formats
ACT MODE
Once planning is complete, Cline helps implement the server:
- Set up the project structure
- Write the implementation code
- Configure settings
- Test each component thoroughly
- Finalize documentation
Case Study: AlphaAdvantage Stock Analysis Server
Let’s walk through the development process of our AlphaAdvantage MCP server, which provides stock data analysis and reporting capabilities.
Planning Phase
During the planning phase, we:
- Defined the problem: Users need access to financial data, stock analysis, and market insights directly through their AI assistant
- Selected the API: AlphaAdvantage API for financial market data
  - Standard API key authentication
  - Rate limits of 5 requests per minute (free tier)
  - Various endpoints for different financial data types
- Designed the tools needed:
  - Stock overview information (current price, company details)
  - Technical analysis with indicators (RSI, MACD, etc.)
  - Fundamental analysis (financial statements, ratios)
  - Earnings report data
  - News and sentiment analysis
- Planned data formatting:
  - Clean, well-formatted markdown output
  - Tables for structured data
  - Visual indicators (↑/↓) for trends
  - Proper formatting of financial numbers
Implementation
We began by bootstrapping the project:
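A typical bootstrap uses the MCP server scaffold; the project name and the choice of axios as the HTTP client below are assumptions, not the case study’s exact commands:

```bash
npx @modelcontextprotocol/create-server alphaadvantage-server
cd alphaadvantage-server
npm install axios
```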
Next, we structured our project with:
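The exact layout will vary; a structure along these lines (illustrative, not the literal tree from the case study) keeps the API client, formatters, and tool handlers separate:

```
alphaadvantage-server/
├── src/
│   ├── index.ts          # Server entry point and tool registration
│   ├── api/
│   │   └── client.ts     # Rate-limited, cached AlphaAdvantage client
│   ├── formatters/
│   │   └── markdown.ts   # Markdown formatting utilities
│   └── types.ts          # Shared TypeScript interfaces
├── package.json
└── tsconfig.json
```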
API Client Implementation
The API client implementation included:
- Rate limiting: Enforcing the 5 requests per minute limit
- Caching: Reducing API calls with strategic caching
- Error handling: Robust error detection and reporting
- Typed interfaces: Clear TypeScript types for all data
Key implementation details:
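A condensed sketch of what such a client might look like; the class name, cache TTL, base URL, and error fields are assumptions rather than the case study’s actual code:

```typescript
import axios from "axios";

const BASE_URL = "https://www.alphavantage.co/query"; // assumed base URL
const MIN_REQUEST_INTERVAL_MS = 12_000; // 5 requests per minute on the free tier
const CACHE_TTL_MS = 5 * 60 * 1000;

interface CacheEntry {
  timestamp: number;
  data: unknown;
}

export class AlphaAdvantageClient {
  private lastRequestTime = 0;
  private cache = new Map<string, CacheEntry>();

  constructor(private apiKey: string) {}

  async request<T>(params: Record<string, string>): Promise<T> {
    const cacheKey = JSON.stringify(params);

    // Serve from cache when possible to reduce API calls
    const cached = this.cache.get(cacheKey);
    if (cached && Date.now() - cached.timestamp < CACHE_TTL_MS) {
      return cached.data as T;
    }

    // Enforce the rate limit by spacing requests at least 12 seconds apart
    const wait = this.lastRequestTime + MIN_REQUEST_INTERVAL_MS - Date.now();
    if (wait > 0) {
      await new Promise((resolve) => setTimeout(resolve, wait));
    }
    this.lastRequestTime = Date.now();

    const response = await axios.get(BASE_URL, {
      params: { ...params, apikey: this.apiKey },
    });

    // The API reports errors and throttling in the response body
    if (response.data["Error Message"] || response.data["Note"]) {
      throw new Error(response.data["Error Message"] ?? response.data["Note"]);
    }

    this.cache.set(cacheKey, { timestamp: Date.now(), data: response.data });
    return response.data as T;
  }
}
```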
Markdown Formatting
We implemented formatters to display financial data beautifully:
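For example, a few small helpers along these lines (the function names are hypothetical) keep numbers, trend arrows, and tables consistent:

```typescript
/** Format a number as a currency string, e.g. 1234567.8 -> "$1,234,567.80". */
export function formatCurrency(value: number): string {
  return new Intl.NumberFormat("en-US", {
    style: "currency",
    currency: "USD",
  }).format(value);
}

/** Show a percentage change with a trend indicator, e.g. "↑ 2.34%". */
export function formatChange(percent: number): string {
  const arrow = percent >= 0 ? "↑" : "↓";
  return `${arrow} ${Math.abs(percent).toFixed(2)}%`;
}

/** Render rows of values as a markdown table. */
export function formatTable(headers: string[], rows: string[][]): string {
  const headerLine = `| ${headers.join(" | ")} |`;
  const dividerLine = `| ${headers.map(() => "---").join(" | ")} |`;
  const rowLines = rows.map((row) => `| ${row.join(" | ")} |`);
  return [headerLine, dividerLine, ...rowLines].join("\n");
}
```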
Tool Implementation
We defined five tools with clear interfaces:
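With the MCP TypeScript SDK, tool definitions are returned from the ListTools handler. The sketch below shows two of the five (input schemas abbreviated; the remaining tools named in the comment are assumptions):

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "alphaadvantage-server", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "get_stock_overview",
      description: "Get current price and company details for a stock symbol",
      inputSchema: {
        type: "object",
        properties: {
          symbol: { type: "string", description: "Ticker symbol, e.g. AAPL" },
        },
        required: ["symbol"],
      },
    },
    {
      name: "get_technical_analysis",
      description: "Get price action and technical indicators (RSI, MACD)",
      inputSchema: {
        type: "object",
        properties: {
          symbol: { type: "string", description: "Ticker symbol" },
        },
        required: ["symbol"],
      },
    },
    // ...plus fundamental analysis, earnings report, and news/sentiment tools
  ],
}));
```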
Each tool’s handler included:
- Input validation
- API client calls with error handling
- Markdown formatting of responses
- Comprehensive logging
Testing Phase
This critical phase involved systematically testing each tool:
- First, we configured the MCP server in the settings (a sample configuration follows this list)
- Then we tested each tool individually:
  - get_stock_overview: Retrieved AAPL stock overview information
  - get_technical_analysis: Obtained price action and RSI data
  - get_earnings_report: Retrieved MSFT earnings history and formatted report
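A minimal settings entry for the server might look like this; the server name, file path, and environment variable name are placeholders for your own values:

```json
{
  "mcpServers": {
    "alphaadvantage": {
      "command": "node",
      "args": ["/path/to/alphaadvantage-server/build/index.js"],
      "env": {
        "ALPHAADVANTAGE_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```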
Challenges and Solutions
During development, we encountered several challenges:
- API Rate Limiting:
  - Challenge: Free tier limited to 5 calls per minute
  - Solution: Implemented queuing, enforced rate limits, and added comprehensive caching
- Data Formatting:
  - Challenge: Raw API data not user-friendly
  - Solution: Created formatting utilities for consistent display of financial data
- Timeout Issues:
  - Challenge: Complex tools making multiple API calls could time out
  - Solution: Broke complex tools into smaller, focused pieces and optimized caching
Lessons Learned
Our AlphaAdvantage implementation taught us several key lessons:
- Plan for API Limits: Understand and design around API rate limits from the beginning
- Cache Strategically: Identify high-value caching opportunities to improve performance
- Format for Readability: Invest in good data formatting for improved user experience
- Test Every Path: Test all tools individually before completion
- Handle API Complexity: For APIs requiring multiple calls, design tools with simpler scopes
Core Implementation Best Practices
Comprehensive Logging
Effective logging is essential for debugging MCP servers:
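A lightweight pattern (assumed here, not the case study’s exact logger) is to write timestamped messages to stderr, since stdout carries MCP protocol messages when the server uses the stdio transport:

```typescript
// Log to stderr: with the stdio transport, stdout is reserved for MCP messages
function log(level: "info" | "error", message: string, detail?: unknown): void {
  const entry = `[${new Date().toISOString()}] [${level.toUpperCase()}] ${message}`;
  console.error(detail !== undefined ? `${entry} ${JSON.stringify(detail)}` : entry);
}

log("info", "Fetching stock overview", { symbol: "AAPL" });
log("error", "AlphaAdvantage request failed", { status: 429 });
```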
Strong Typing
Type definitions prevent errors and improve maintainability:
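For example, typing API responses and tool inputs up front (the interface names below are hypothetical) catches mismatches at compile time and documents the data you expect:

```typescript
/** Company overview data returned by the API (fields abbreviated). */
interface StockOverview {
  symbol: string;
  name: string;
  price: number;
  peRatio: number | null;
  marketCap: number;
}

/** Validated input for the get_stock_overview tool. */
interface StockOverviewArgs {
  symbol: string;
}

function isStockOverviewArgs(value: unknown): value is StockOverviewArgs {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as StockOverviewArgs).symbol === "string"
  );
}
```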
Intelligent Caching
Reduce API calls and improve performance:
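A simple in-memory cache with a time-to-live is often enough; the helper below is a sketch, and the five-minute TTL is illustrative:

```typescript
class TtlCache<T> {
  private entries = new Map<string, { expires: number; value: T }>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.entries.delete(key); // evict stale entries lazily
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.entries.set(key, { expires: Date.now() + this.ttlMs, value });
  }
}

// Cache formatted overview reports for 5 minutes to avoid redundant API calls
const overviewCache = new TtlCache<string>(5 * 60 * 1000);
```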
Graceful Error Handling
Implement robust error handling that maintains a good user experience:
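In a CallTool handler, that means catching failures and returning a readable error result instead of letting the server crash. The sketch below assumes an existing server instance and a hypothetical fetchStockOverview helper:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// `server` is the Server instance created at startup; `fetchStockOverview`
// is a hypothetical helper that calls the API and formats the result.
declare const server: Server;
declare function fetchStockOverview(args: unknown): Promise<string>;

server.setRequestHandler(CallToolRequestSchema, async (request) => {
  try {
    const report = await fetchStockOverview(request.params.arguments);
    return { content: [{ type: "text", text: report }] };
  } catch (error) {
    // Surface a readable message to the user rather than failing silently
    const message = error instanceof Error ? error.message : String(error);
    return {
      content: [
        { type: "text", text: `AlphaAdvantage request failed: ${message}` },
      ],
      isError: true,
    };
  }
});
```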
MCP Resources
Resources let your MCP servers expose data to Cline without executing code. They’re perfect for providing context like files, API responses, or database records that Cline can reference during conversations.
Adding Resources to Your MCP Server
- Define the resources your server will expose (see the sketch after this list)
- Implement read handlers to deliver the content (also shown in the sketch)
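Both steps, sketched with the MCP TypeScript SDK; the resource URI and its content are placeholders:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  ListResourcesRequestSchema,
  ReadResourceRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

declare const server: Server; // the Server instance created at startup

// 1. Define the resources the server exposes
server.setRequestHandler(ListResourcesRequestSchema, async () => ({
  resources: [
    {
      uri: "alphaadvantage://watchlist",
      name: "Stock watchlist",
      mimeType: "application/json",
      description: "Symbols the user is currently tracking",
    },
  ],
}));

// 2. Implement the read handler that delivers the content
server.setRequestHandler(ReadResourceRequestSchema, async (request) => {
  if (request.params.uri !== "alphaadvantage://watchlist") {
    throw new Error(`Unknown resource: ${request.params.uri}`);
  }
  return {
    contents: [
      {
        uri: request.params.uri,
        mimeType: "application/json",
        text: JSON.stringify(["AAPL", "MSFT"]),
      },
    ],
  };
});
```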
Resources make your MCP servers more context-aware, allowing Cline to access specific information without requiring you to copy/paste. For more information, refer to the official documentation.
Common Challenges and Solutions
API Authentication Complexities
Challenge: APIs often have different authentication methods.
Solution:
- For API keys, use environment variables in the MCP configuration
- For OAuth, create a separate script to obtain refresh tokens
- Store sensitive tokens securely
Missing or Limited API Features
Challenge: APIs may not provide all the functionality you need.
Solution:
- Implement fallbacks using available endpoints
- Create simulated functionality where necessary
- Transform API data to match your needs
API Rate Limiting
Challenge: Most APIs have rate limits that can cause failures.
Solution:
- Implement proper rate limiting
- Add intelligent caching
- Provide graceful degradation
- Return transparent error messages when rate limits are hit