# MCP Server - Z.ai Integration

A Model Context Protocol (MCP) server that provides filesystem access and Z.ai GLM-4 model integration for MCP clients.

## Features

- **Filesystem Access**: Read and write files in the parent repository
- **Z.ai Integration**: Use a local Z.ai API key with GLM-4 models for code generation
- **Submodule Detection**: Automatically detects when running as a git submodule and sets the working root to the parent directory
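As a rough illustration of the submodule-detection feature: when a repository is checked out as a git submodule, its `.git` entry is a plain file (containing a `gitdir:` pointer) rather than a directory. A detector can key off that. The helper below is a hypothetical sketch, not the server's actual code:

```python
from pathlib import Path

def detect_working_root(server_dir: Path) -> Path:
    """Sketch of submodule detection (hypothetical helper).

    In a submodule checkout, `.git` is a file pointing at the parent
    repository's git directory; in a standalone clone it is a directory.
    """
    git_path = server_dir / ".git"
    if git_path.is_file():
        # Submodule: treat the parent repository as the working root.
        return server_dir.parent
    # Standalone checkout: the server's own directory is the root.
    return server_dir
```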
## Installation as a Git Submodule

```bash
cd your-parent-project
git submodule add https://github.com/your-username/mcp-server.git mcp-server
cd mcp-server
pip install -r requirements.txt
cp .env.example .env
```

Edit `.env` and add your Z.ai API key:

```bash
ZAI_API_KEY=your_actual_api_key
```
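For reference, a `.env` file of this shape can be parsed with a few lines of standard-library Python. This is a minimal sketch of how the key might be loaded; the server itself may use a library such as python-dotenv instead:

```python
def load_env_file(path=".env"):
    """Minimal .env parser (illustrative sketch, not the server's code).

    Reads KEY=value lines; blank lines, '#' comments, and lines
    without '=' are ignored.
    """
    values = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # No .env present; fall back to process environment.
    return values
```

A caller would typically check `os.environ` first and fall back to the parsed file.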
## MCP Client Configuration

### Claude Desktop

Add the server to your Claude Desktop configuration file:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "zai-server": {
      "command": "python",
      "args": ["mcp-server/src/main.py"],
      "env": {
        "ZAI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```
## Tools

### read_file

Read the contents of a file, relative to the parent repository root.

### write_file

Write content to a file, relative to the parent repository root.
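Because both tools take paths relative to the repository root, a server like this one typically rejects paths that escape the root (e.g. via `..`). The helper below sketches one way to do that; it is an illustrative assumption, not the server's actual validation code:

```python
from pathlib import Path

def resolve_in_root(root: Path, relative: str) -> Path:
    """Resolve a client-supplied relative path inside the repository root.

    Raises ValueError if the resolved path escapes the root
    (a sketch of the sandboxing read_file/write_file would need).
    """
    root = root.resolve()
    target = (root / relative).resolve()
    if target != root and root not in target.parents:
        raise ValueError(f"path escapes repository root: {relative}")
    return target
```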
### zai_generate

Generate code or perform reasoning tasks using Z.ai GLM-4 models.
Parameters:

- `prompt` (string): The prompt to send to the model
- `model` (string, optional): Model identifier (default: `"glm-4"`)
- `temperature` (number, optional): Sampling temperature (default: `0.7`)
- `max_tokens` (number, optional): Maximum tokens to generate (default: `2000`)
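The parameters above map naturally onto a chat-completions-style request. The sketch below assumes an OpenAI-compatible wire format for the Z.ai API, which is an assumption about the actual endpoint, not a documented guarantee:

```python
def build_request(prompt, model="glm-4", temperature=0.7, max_tokens=2000):
    """Build the JSON payload zai_generate might send.

    Assumes an OpenAI-style chat-completions body; the real Z.ai wire
    format (and the Authorization header carrying ZAI_API_KEY) should
    be checked against the Z.ai API documentation.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
```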
## Usage Example

Once configured, you can use the tools from your MCP client:

```text
User: Read the main.py file
Assistant: [Uses read_file tool]

User: Generate a sorting algorithm using Z.ai
Assistant: [Uses zai_generate tool with a prompt about sorting algorithms]
```
## License

MIT