
Google Gemini

A Python-based MCP server built with FastMCP that integrates Google's Gemini AI models into the Model Context Protocol ecosystem using an OpenAI-compatible API endpoint. It allows tools and applications to communicate with Gemini for advanced text generation and reasoning tasks.


Gemini MCP Server (in Python)

Model Context Protocol (MCP) server for Gemini integration, built on FastMCP.

The server is implemented in Python using the FastMCP framework and exposes Gemini, via its OpenAI-compatible endpoint, as MCP tools.
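
The repository's exact tool surface isn't documented here, but a minimal FastMCP server along these lines illustrates the overall shape. This is a sketch, not the project's actual code: the tool name ask_gemini and the use of the openai client package are assumptions.

import os

from fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("gemini")

# Point the OpenAI client at Gemini's OpenAI-compatible endpoint.
client = OpenAI(
    api_key=os.environ["GEMINI_API_KEY"],
    base_url=os.environ.get(
        "GEMINI_BASE_URL",
        "https://generativelanguage.googleapis.com/v1beta/openai/",
    ),
)

@mcp.tool()
def ask_gemini(prompt: str) -> str:
    """Send a prompt to Gemini and return the generated text."""
    response = client.chat.completions.create(
        model=os.environ.get("GEMINI_MODEL", "gemini-2.5-flash"),
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    mcp.run()  # serves MCP over stdio, which is what the Docker setup below expects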

Quick Start

  1. Build the Docker image:
docker build -t gemini-mcp-server .
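
Once built, you can smoke-test the image before wiring it into an editor. The values below are placeholders, and the flags mirror the Cursor/Claude config in the next section; the container speaks MCP over stdin/stdout, so it will sit waiting for protocol messages:

docker run --rm -i \
  -e GEMINI_API_KEY=your_api_key_here \
  -e GEMINI_MODEL=gemini-2.5-flash \
  -e GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai/ \
  gemini-mcp-server:latest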

Integration with Cursor/Claude

In MCP Settings -> Add MCP server, add this config:

{
  "mcpServers": {
    "gemini": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--network",
        "host",
        "-e",
        "GEMINI_API_KEY",
        "-e",
        "GEMINI_MODEL",
        "-e",
        "GEMINI_BASE_URL",
        "-e",
        "HTTP_PROXY",
        "-e",
        "HTTPS_PROXY",
        "gemini-mcp-server:latest"
      ],
      "env": {
        "GEMINI_API_KEY":"your_api_key_here",
        "GEMINI_MODEL":"gemini-2.5-flash",
        "GEMINI_BASE_URL":"https://generativelanguage.googleapis.com/v1beta/openai/",
        "HTTP_PROXY":"http://127.0.0.1:17890",
        "HTTPS_PROXY":"http://127.0.0.1:17890"

      }
    }
  }
}

Note: Replace the GEMINI_API_KEY, GEMINI_MODEL, GEMINI_BASE_URL, HTTP_PROXY, and HTTPS_PROXY values with your actual Gemini credentials, endpoint, and proxy settings. Each -e flag in args forwards the corresponding variable from the env map into the container.
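
Before pasting real values into the config, you can verify that the key, model, and base URL work together. A quick standalone check, assuming the openai Python package is installed and the three variables are exported in your shell:

import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GEMINI_API_KEY"],
    base_url=os.environ["GEMINI_BASE_URL"],
)
reply = client.chat.completions.create(
    model=os.environ["GEMINI_MODEL"],
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)

If this prints a response, the same settings should work inside the container.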
