
guidance-lark-mcp

Validate and test llguidance grammars with batch testing and documentation

Registry · Updated Mar 6, 2026

Quick Install

uvx guidance-lark-mcp

MCP Grammar Tools

MCP server for validating and testing llguidance grammars (Lark format). Provides grammar validation, batch test execution, and syntax documentation — ideal for iteratively building grammars with AI coding assistants.

Installation

With uvx (recommended)

uvx guidance-lark-mcp

With pip

pip install guidance-lark-mcp

From source

git clone https://github.com/guidance-ai/guidance-lark-mcp
cd guidance-lark-mcp
pip install -e .

MCP Client Configuration

GitHub Copilot CLI

You can add the server using the interactive /mcp add command or by editing the config file directly. See the Copilot CLI MCP documentation for full details.

Option 1: Interactive setup

In the Copilot CLI, run /mcp add, select Local/STDIO, and enter uvx guidance-lark-mcp as the command.

Option 2: Edit config file

Add the following to ~/.copilot/mcp-config.json:

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "tools": ["*"]
    }
  }
}

This gives you grammar validation and batch testing out of the box. To also enable LLM-powered generation (generate_with_grammar), add ENABLE_GENERATION and your credentials to env:

"env": {
  "ENABLE_GENERATION": "true",
  "OPENAI_API_KEY": "your-key-here"
}

For Azure OpenAI (with Entra ID via az login), use guidance-lark-mcp[azure] and set the endpoint instead:

"args": ["guidance-lark-mcp[azure]"],
"env": {
  "ENABLE_GENERATION": "true",
  "AZURE_OPENAI_ENDPOINT": "https://your-resource.openai.azure.com/",
  "OPENAI_MODEL": "your-deployment-name"
}

See Backend Configuration for all supported backends.

After saving, use /mcp show to verify the server is connected.

VS Code

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      },
      "tools": ["*"]
    }
  }
}

Claude Desktop

Add the following to your claude_desktop_config.json file:

{
  "mcpServers": {
    "grammar-tools": {
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}

Usage

Available Tools

  1. validate_grammar — Validate grammar completeness and consistency using llguidance's built-in validator.

    {"grammar": "start: \"hello\" \"world\""}
    
  2. run_batch_validation_tests — Run batch validation tests from a JSON file against a grammar. Returns pass/fail statistics and detailed failure info.

    {
      "grammar": "start: /[0-9]+/",
      "test_file": "tests.json"
    }
    

    Test file format:

    [
      {"input": "123", "should_parse": true, "description": "Valid number"},
      {"input": "abc", "should_parse": false, "description": "Not a number"}
    ]
    
  3. get_llguidance_documentation — Fetch the llguidance grammar syntax documentation from the official repo.

  4. generate_with_grammar (optional; requires ENABLE_GENERATION=true) — Generate text from an OpenAI model constrained by a grammar. Uses the Responses API with the custom-tool grammar format, so the output is guaranteed to conform to the grammar. Requires the OPENAI_API_KEY environment variable. See Backend Configuration for Azure and other endpoints.
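The test file consumed by run_batch_validation_tests is plain JSON, so it is easy to sanity-check before handing it to the server. A quick pre-flight check of a test file's shape (a sketch, independent of the server itself):

```python
import json

# The tests.json format shown above: a list of cases with
# "input", "should_parse", and an optional "description".
raw = """
[
  {"input": "123", "should_parse": true, "description": "Valid number"},
  {"input": "abc", "should_parse": false, "description": "Not a number"}
]
"""

cases = json.loads(raw)
for case in cases:
    assert isinstance(case["input"], str), "input must be a string"
    assert isinstance(case["should_parse"], bool), "should_parse must be a bool"
print(f"{len(cases)} test cases look well-formed")
```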

Backend Configuration

The generate_with_grammar tool uses the OpenAI Python SDK, which natively supports multiple backends via environment variables:

Backend                 | Required env vars                           | Optional env vars
OpenAI (default)        | OPENAI_API_KEY                              | OPENAI_MODEL
Azure OpenAI (API key)  | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY | AZURE_OPENAI_API_VERSION, OPENAI_MODEL
Azure OpenAI (Entra ID) | AZURE_OPENAI_ENDPOINT + az login            | AZURE_OPENAI_API_VERSION, OPENAI_MODEL
Custom endpoint         | OPENAI_API_KEY, OPENAI_BASE_URL             | OPENAI_MODEL

The server auto-detects which backend to use:

  • If AZURE_OPENAI_ENDPOINT is set → uses AzureOpenAI client (with Entra ID or API key)
  • Otherwise → uses OpenAI client (reads OPENAI_API_KEY and OPENAI_BASE_URL automatically)

The server logs which backend it detects on startup.
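The detection rule above amounts to a couple of environment-variable checks. A simplified Python sketch (an illustration of the rule as described, not the server's actual code; the returned labels are made up for the example):

```python
def detect_backend(env):
    """Sketch of the backend auto-detection rule described above."""
    if env.get("AZURE_OPENAI_ENDPOINT"):
        # Azure: an API key wins if present; otherwise Entra ID via az login
        if env.get("AZURE_OPENAI_API_KEY"):
            return "azure-openai (API key)"
        return "azure-openai (Entra ID)"
    # Plain OpenAI client; OPENAI_BASE_URL (if set) points at a custom endpoint
    if env.get("OPENAI_BASE_URL"):
        return "custom endpoint"
    return "openai"
```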

Example: Azure OpenAI (API key)

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "AZURE_OPENAI_API_KEY": "your-azure-key",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}

Example: Azure OpenAI (Entra ID / keyless)

Requires az login and the azure extra: pip install guidance-lark-mcp[azure]

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp[azure]"],
      "env": {
        "ENABLE_GENERATION": "true",
        "AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com",
        "OPENAI_MODEL": "gpt-4.1"
      },
      "tools": ["*"]
    }
  }
}
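Example: Custom endpoint

The table above also lists custom OpenAI-compatible endpoints via OPENAI_BASE_URL. A sketch of that configuration (the base URL and model name below are placeholders):

{
  "mcpServers": {
    "grammar-tools": {
      "type": "local",
      "command": "uvx",
      "args": ["guidance-lark-mcp"],
      "env": {
        "ENABLE_GENERATION": "true",
        "OPENAI_API_KEY": "your-key-here",
        "OPENAI_BASE_URL": "https://your-endpoint.example.com/v1",
        "OPENAI_MODEL": "your-model-name"
      },
      "tools": ["*"]
    }
  }
}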

Example Workflow

Build a grammar iteratively with an AI assistant:

  1. Start with the spec — paste EBNF rules from a language specification
  2. Write a basic grammar — translate a few rules to Lark format
  3. Validate — use validate_grammar to check for missing rules
  4. Write tests — create a JSON test file with sample inputs
  5. Batch test — use run_batch_validation_tests to find failures
  6. Fix & repeat — refine the grammar until all tests pass
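As a concrete illustration of steps 1–2, a hypothetical EBNF rule such as integer = digit , { digit } might be translated into Lark format like this (a sketch; the rule names are illustrative):

// EBNF: integer = digit , { digit } ;
start: INT
INT: /[0-9]+/

A matching test file for steps 4–5 would then follow the tests.json format shown above: "123" should parse, "abc" should not.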

Example Grammars

The examples/ directory includes sample grammars built using these tools, with Lark grammar files, test suites, and documentation:

  • GraphQL — executable subset of the GraphQL spec (queries, mutations, fragments, variables)

Troubleshooting

Server fails to connect in Copilot CLI / VS Code?

MCP clients like Copilot CLI typically report only a generic "Connection closed" message when a server crashes on startup. To see the actual error, run the server directly in your terminal:

uvx guidance-lark-mcp

Or with generation enabled:

ENABLE_GENERATION=true OPENAI_API_KEY=your-key uvx guidance-lark-mcp

Common issues:

  • Missing credentials — ENABLE_GENERATION=true without a valid OPENAI_API_KEY or AZURE_OPENAI_ENDPOINT. The server will still start and serve validation tools; generate_with_grammar will return a descriptive error.
  • Azure Entra ID — make sure you've run az login and are using guidance-lark-mcp[azure] (not the base package).
  • Slow first start — uvx needs to resolve and install dependencies on first run, which may exceed the MCP client's connection timeout. Run uvx guidance-lark-mcp once manually to warm the cache.
  • Updating to a new version — uvx caches packages, so after a new release you may need to clear the cache and restart your MCP client:
    uv cache clean guidance-lark-mcp

Development

git clone https://github.com/guidance-ai/guidance-lark-mcp
cd guidance-lark-mcp
uv sync
uv run pytest tests/ -q
