

LangGraph Deep Agents + CopilotKit Generative UI Template

Backend: uv · Ruff · Mypy · Python 3.13
Frontend: Next.js

A starter template for building AI agent applications with beautiful generative UIs, combining LangChain Deep Agents (LangGraph-powered backend) and CopilotKit (React frontend with generative UI). This project demonstrates a weather assistant application that showcases real-time tool calling with custom UI components.

🎯 Project Overview

This is a project bootstrap/demonstration that serves as a template for building AI agent applications with:

  • Backend: Python FastAPI service using LangChain Deep Agents framework for intelligent agent orchestration
  • Frontend: Next.js application with CopilotKit for generative UI and real-time agent interaction
  • Infrastructure: Kubernetes deployment with Skaffold for streamlined local development

Key Features

🧠 Deep Agents Framework - Built on LangGraph with support for:

  • Complex multi-step task planning
  • Tool usage and sub-agent delegation
  • Long-term memory with checkpointing
  • Streaming responses

🎨 Generative UI - Dynamic UI components rendered based on agent tool calls:

  • Custom weather cards with animations
  • Real-time status updates
  • Beautiful gradients and responsive design

🚀 Kubernetes-Native Deployment:

  • Container orchestration
  • Automated builds and deploys with Skaffold
  • Hot reloading for development
  • Multi-service architecture

🏗️ Architecture

High-Level System Architecture

graph TB
    subgraph "Frontend - Next.js + CopilotKit"
        UI[React UI]
        CK[CopilotKit Runtime]
        RT[useRenderToolCall Hook]
    end
    
    subgraph "Backend - FastAPI + Deep Agents"
        API[FastAPI Server]
        DA[Deep Agent Graph]
        LLM[LLM - ChatLiteLLM]
        TOOLS[Tools - get_weather]
        MEM[Memory Checkpointer]
    end
    
    subgraph "Kubernetes Cluster"
        FS[Frontend Service]
        BS[Backend Service]
    end
    
    UI --> CK
    CK --> |HTTP| API
    API --> DA
    DA --> LLM
    DA --> TOOLS
    DA --> MEM
    CK --> RT
    RT --> |Render Tool Calls| UI
    
    FS --> UI
    BS --> API
    
    style UI fill:#61dafb
    style CK fill:#0ea5e9
    style DA fill:#f59e0b
    style LLM fill:#8b5cf6

Data Flow - Tool Call with Generative UI

sequenceDiagram
    participant User
    participant Frontend
    participant CopilotKit
    participant Backend
    participant DeepAgent
    participant LLM
    participant Tool
    
    User->>Frontend: "What's the weather in SF?"
    Frontend->>CopilotKit: Send message
    CopilotKit->>Backend: POST /api/copilotkit
    Backend->>DeepAgent: Process message
    DeepAgent->>LLM: Generate response
    LLM->>DeepAgent: Call get_weather tool
    DeepAgent->>Tool: get_weather("San Francisco")
    Tool-->>DeepAgent: Weather data (JSON)
    DeepAgent-->>Backend: Tool result
    Backend-->>CopilotKit: Stream response
    CopilotKit->>Frontend: Tool call event
    Frontend->>Frontend: useRenderToolCall renders custom UI
    Frontend-->>User: Beautiful weather card 🌤️

Component Architecture

graph LR
    subgraph "Backend Container :8123"
        MAIN[main.py]
        UTILS[utils.py]
        AGUI[ag_ui_langgraph]
        CKBE[CopilotKit Backend]
        
        MAIN --> UTILS
        MAIN --> AGUI
        UTILS --> |create_deep_agent| DA[Deep Agent Graph]
        AGUI --> |LangGraphAGUIAgent| CKBE
    end
    
    subgraph "Frontend Container :3000"
        PAGE[page.tsx]
        ROUTE[route.ts]
        CKRT[CopilotRuntime]
        UI[Generative UI]
        
        PAGE --> UI
        ROUTE --> CKRT
        CKRT --> |LangGraphHttpAgent| HTTP
    end
    
    HTTP[HTTP Client] --> |http://backend:8123| CKBE
    
    style DA fill:#f59e0b
    style UI fill:#61dafb
    style CKBE fill:#0ea5e9
    style CKRT fill:#0ea5e9

🐍 Backend - Deep Agents with LangGraph

The backend is a FastAPI service that uses the Deep Agents framework - an agent harness built on top of LangGraph for complex, multi-step tasks.

Technology Stack

| Component | Technology | Purpose |
| --- | --- | --- |
| Framework | FastAPI | High-performance async API server |
| Agent Framework | Deep Agents (0.3.12) | Agent orchestration and planning |
| Runtime | LangGraph | Durable execution, streaming, HITL |
| LLM | ChatLiteLLM | Flexible LLM integration (GitHub Copilot) |
| Integration | CopilotKit | Frontend-backend agent communication |
| Memory | MemorySaver | Conversation state persistence |

Project Structure

backend/
├── src/
│   └── agent/
│       ├── main.py          # FastAPI app + CopilotKit integration
│       └── utils.py         # Agent builder + tools
├── tests/
│   └── agent/
│       └── test_main.py     # Unit tests
├── k8s/
│   └── deployment.yaml      # Kubernetes manifests
├── Dockerfile               # Container image
├── pyproject.toml           # Python dependencies
└── Makefile                 # Build and run commands

Key Components

1. Agent Builder (utils.py)

from deepagents import create_deep_agent
from copilotkit import CopilotKitMiddleware

def build_agent():
    agent_graph = create_deep_agent(
        model=ChatLiteLLM(model="github_copilot/gpt-5-mini"),
        tools=[get_weather],
        middleware=[CopilotKitMiddleware()],
        system_prompt="You are a helpful assistant",
        checkpointer=MemorySaver(),
    )
    return agent_graph

Deep Agents Features Used:

  • 🧠 Model: Flexible LLM integration via LiteLLM
  • 🔧 Tools: Custom tool functions (get_weather)
  • 🔌 Middleware: CopilotKit for streaming and UI updates
  • 💾 Checkpointer: Conversation memory across sessions
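The get_weather tool referenced above can be a plain Python function. A minimal sketch — the real implementation in utils.py may call an external API; the field names here are taken from the frontend weather card:

```python
import json

def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    # Sketch: canned data; a real tool would query a weather API.
    data = {
        "location": location,
        "temperature": 18,
        "unit": "C",
        "weather": "Partly cloudy",
    }
    # Tools return strings; the frontend JSON-parses this for the card.
    return json.dumps(data)
```

The docstring matters: Deep Agents passes it to the LLM so the model knows when to call the tool.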

2. FastAPI Server (main.py)

from ag_ui_langgraph import add_langgraph_fastapi_endpoint
from copilotkit import LangGraphAGUIAgent

app = FastAPI()
agent_graph = build_agent()

add_langgraph_fastapi_endpoint(
    app=app,
    agent=LangGraphAGUIAgent(
        name="weather_application_assistant",
        graph=agent_graph,
    ),
    path="/",
)

The LangGraphAGUIAgent wraps the Deep Agents graph and exposes it via FastAPI endpoints that CopilotKit can connect to.

Running the Backend

# Install dependencies
cd backend
uv sync

# Run locally
uv run python src/agent/main.py

# Run tests
uv run pytest

# Build Docker image
make build

The backend listens on http://0.0.0.0:8123 and exposes:

  • / - LangGraph agent endpoints
  • /healthz - Health check endpoint
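A quick way to verify the backend is up is to probe /healthz. This helper is purely illustrative (not part of the repo):

```python
import urllib.request
import urllib.error

def check_health(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# e.g. check_health("http://localhost:8123/healthz")
```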

⚛️ Frontend - Next.js with CopilotKit Generative UI

The frontend is a Next.js application that uses CopilotKit to create beautiful generative UIs that respond to agent tool calls in real-time.

Technology Stack

| Component | Technology | Purpose |
| --- | --- | --- |
| Framework | Next.js 16 | React framework with App Router |
| UI Library | CopilotKit (1.51.4) | Agent integration + generative UI |
| Styling | Tailwind CSS | Utility-first styling |
| Agent Client | LangGraphHttpAgent | HTTP client for backend connection |
| Language | TypeScript | Type-safe development |

Project Structure

frontend/
├── src/
│   └── app/
│       ├── page.tsx              # Main page with generative UI
│       ├── layout.tsx            # App layout with CopilotKit provider
│       ├── globals.css           # Global styles
│       └── api/
│           └── copilotkit/
│               └── route.ts      # CopilotKit API endpoint
├── k8s/
│   └── deployment.yaml           # Kubernetes manifests
├── Dockerfile                    # Container image
├── package.json                  # Node dependencies
└── next.config.ts                # Next.js configuration

Key Components

1. CopilotKit Runtime (route.ts)

import { CopilotRuntime } from "@copilotkit/runtime";
import { LangGraphHttpAgent } from "@copilotkit/runtime/langgraph";

const runtime = new CopilotRuntime({
  agents: {
    weather_assistant: new LangGraphHttpAgent({
      url: process.env.LANGGRAPH_DEPLOYMENT_URL || "http://localhost:8123",
    }),
  },
});

The LangGraphHttpAgent connects to the backend FastAPI server and streams tool calls and responses.

2. Generative UI with useRenderToolCall (page.tsx)

import { useRenderToolCall } from "@copilotkit/react-core";

useRenderToolCall({
  name: "get_weather",
  render: ({status, args, result}) => {
    // result is only populated once the tool call completes
    if (status !== "complete") {
      return <div className="weather-card">Loading…</div>;
    }
    const weatherData = JSON.parse(result);
    
    return (
      <div className="weather-card">
        <h3>{weatherData.location}</h3>
        <div className="temp">{weatherData.temperature}°{weatherData.unit}</div>
        <p>{weatherData.weather}</p>
      </div>
    );
  }
});

Generative UI Features:

  • 🎨 Custom Rendering: Fully customizable UI components
  • ⚡ Real-time Updates: Status changes as the tool executes
  • 🔄 Loading States: Built-in loading animations
  • 📊 Rich Data Display: Parse and display structured data
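The only contract between the backend tool and the frontend card is the JSON shape. A small validation sketch (field names are taken from the render function above; this helper is illustrative, not repo code):

```python
import json

# Fields the weather card renders
REQUIRED_FIELDS = {"location", "temperature", "unit", "weather"}

def validate_weather_payload(raw: str) -> dict:
    """Parse a tool result string and verify the fields the card needs."""
    data = json.loads(raw)
    missing = REQUIRED_FIELDS - data.keys()
    if missing:
        raise ValueError(f"weather payload missing fields: {sorted(missing)}")
    return data
```

Keeping a check like this in backend tests catches contract drift before it breaks the UI.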

3. Weather Card Component

The weather card demonstrates advanced generative UI features:

  • Dynamic gradients based on weather conditions
  • Weather icons (☀️, ☁️, 🌧️, ❄️, ⛈️)
  • Animated loading states
  • Responsive hover effects
  • Real-time status indicators
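Icon selection can be a simple lookup keyed on the condition string. A hedged sketch — the condition names are assumptions, and the real component may key on different strings:

```python
# Map weather conditions to the icons listed above
WEATHER_ICONS = {
    "sunny": "☀️",
    "cloudy": "☁️",
    "rain": "🌧️",
    "snow": "❄️",
    "storm": "⛈️",
}

def weather_icon(condition: str) -> str:
    """Pick an icon for a condition, falling back to a neutral default."""
    return WEATHER_ICONS.get(condition.lower(), "🌤️")
```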

Running the Frontend

# Install dependencies
cd frontend
npm install

# Run development server
npm run dev

# Build for production
npm run build

# Start production server
npm start

The frontend runs on http://localhost:3000 and connects to the backend via the LANGGRAPH_DEPLOYMENT_URL environment variable.


☸️ Kubernetes & Skaffold Deployment

This project uses Skaffold for streamlined Kubernetes development with hot reloading and automatic rebuilds.

Architecture

graph TB
    subgraph "Kubernetes Cluster"
        subgraph "Frontend Service"
            FP[Port 3000]
            FD[Deployment: frontend]
            FC[Container: Next.js]
            
            FP --> FD
            FD --> FC
        end
        
        subgraph "Backend Service"
            BP[Port 8123]
            BD[Deployment: backend]
            BC[Container: FastAPI]
            
            BP --> BD
            BD --> BC
        end
        
        FC --> |HTTP| BP
    end
    
    DEV[Developer] --> |skaffold dev| Skaffold
    Skaffold --> |Build Images| Docker
    Skaffold --> |Deploy| K8S[kubectl]
    K8S --> FD
    K8S --> BD
    Skaffold --> |Port Forward| FP
    DEV --> |http://localhost:3000| FP
    
    style FD fill:#61dafb
    style BD fill:#f59e0b
    style Skaffold fill:#0ea5e9

Skaffold Configuration

The skaffold.yaml file defines the build and deployment pipeline:

build:
  artifacts:
    - image: frontend
      sync:
        infer:
          - "**/*.ts"
          - "**/*.tsx"
          - "**/*.css"
    
    - image: backend
      sync:
        infer:
          - "**/*.py"

manifests:
  rawYaml:
    - frontend/k8s/deployment.yaml
    - backend/k8s/deployment.yaml

portForward:
  - resourceType: service
    resourceName: frontend
    port: 3000

Key Features

| Feature | Description |
| --- | --- |
| Hot Reload | File sync for .ts, .tsx, .py files - no rebuild needed |
| Auto Rebuild | Automatic Docker image rebuild on code changes |
| Port Forwarding | Access frontend at localhost:3000 |
| Service Discovery | Backend accessible at http://backend:8123 from frontend |
| Local Development | Full Kubernetes environment on your machine |
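Service discovery works because the frontend reads LANGGRAPH_DEPLOYMENT_URL (set to http://backend:8123 by the Deployment manifest) and falls back to localhost outside the cluster. The same resolution pattern, sketched in Python:

```python
import os

def backend_url() -> str:
    """Resolve the backend URL: in-cluster service DNS name, or local fallback."""
    return os.environ.get("LANGGRAPH_DEPLOYMENT_URL", "http://localhost:8123")
```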

Kubernetes Resources

Backend Service (backend/k8s/deployment.yaml)

apiVersion: v1
kind: Service
metadata:
  name: backend
spec:
  ports:
    - port: 8123
  selector:
    app: backend
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: backend
spec:
  replicas: 1
  template:
    spec:
      containers:
        - name: backend
          image: backend

Frontend Service (frontend/k8s/deployment.yaml)

apiVersion: v1
kind: Service
metadata:
  name: frontend
spec:
  ports:
    - port: 3000
  selector:
    app: frontend
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: frontend
spec:
  replicas: 1
  template:
    spec:
      containers:
        - name: frontend
          image: frontend
          env:
            - name: LANGGRAPH_DEPLOYMENT_URL
              value: "http://backend:8123"

Development Workflow

# Start development environment
skaffold dev

# This will:
# 1. Build Docker images for frontend and backend
# 2. Deploy to Kubernetes
# 3. Set up port forwarding
# 4. Watch for file changes and hot reload

# Access the application
open http://localhost:3000

# Clean up
# Press Ctrl+C to stop skaffold
# Resources are automatically deleted

Production Deployment

For production, you would:

  1. Build and push images to a container registry
  2. Update image references in deployment manifests
  3. Apply manifests to production cluster
  4. Configure ingress for external access

# Build production images
docker build -t your-registry/backend:v1.0 ./backend
docker build -t your-registry/frontend:v1.0 ./frontend

# Push to registry
docker push your-registry/backend:v1.0
docker push your-registry/frontend:v1.0

# Deploy to production
kubectl apply -f backend/k8s/deployment.yaml
kubectl apply -f frontend/k8s/deployment.yaml

🚀 Getting Started

Prerequisites

  • Python 3.13+ and uv for backend
  • Node.js 20+ and npm for frontend
  • Docker for containerization
  • Local Kubernetes cluster (Minikube or similar)
  • Skaffold CLI for development workflow
  • API Keys: GitHub Copilot or other LLM provider

Quick Start

  1. Clone the repository
git clone https://github.com/nsphung/agent-studio-starter.git
cd agent-studio-starter
  2. Start with Skaffold
# Make sure a local Kubernetes cluster is running (Minikube or similar)
skaffold dev
  3. Access the application

Open your browser to http://localhost:3000 and start chatting!

Try asking:

  • "What's the weather in San Francisco?"
  • "Tell me the weather in Tokyo"
  • "How's the weather in London?"

Manual Setup (Without Skaffold)

Backend

cd backend
uv sync
uv run python src/agent/main.py

Frontend

cd frontend
npm install
npm run dev

📁 Project Structure

agent-studio-starter/
├── backend/                  # Python FastAPI backend
│   ├── src/agent/           # Agent code
│   ├── tests/               # Unit tests
│   ├── k8s/                 # Kubernetes manifests
│   ├── Dockerfile           # Container image
│   └── pyproject.toml       # Dependencies
│
├── frontend/                # Next.js frontend
│   ├── src/app/            # Next.js app
│   ├── k8s/                # Kubernetes manifests
│   ├── Dockerfile          # Container image
│   └── package.json        # Dependencies
│
├── notebooks/              # Jupyter notebooks for evaluation
│   └── evaluate.ipynb      # Agent evaluation
│
├── skaffold.yaml          # Skaffold configuration
├── Makefile               # Build commands
└── README.md              # This file

🎓 Learning Resources

LangChain Deep Agents

CopilotKit

Kubernetes & Skaffold


🔧 Customization Guide

Adding New Tools

  1. Define the tool in backend/src/agent/utils.py:
def my_custom_tool(param: str) -> str:
    """Tool description for the LLM."""
    # Your tool logic
    return result
  2. Add to agent tools:
agent_graph = create_deep_agent(
    model=model,
    tools=[get_weather, my_custom_tool],  # Add your tool
    middleware=[CopilotKitMiddleware()],
)
  3. Create generative UI in frontend/src/app/page.tsx:
useRenderToolCall({
  name: "my_custom_tool",
  render: ({status, args, result}) => (
    <YourCustomComponent data={result} />
  )
});
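A fleshed-out version of step 1, with type hints and a docstring (the LLM reads the docstring to decide when to call the tool). The get_time tool here is purely illustrative:

```python
import json
from datetime import datetime, timedelta, timezone

def get_time(tz_offset_hours: int = 0) -> str:
    """Get the current time, optionally shifted from UTC by a whole-hour offset."""
    now = datetime.now(timezone.utc) + timedelta(hours=tz_offset_hours)
    # Return JSON so a matching useRenderToolCall handler can render it
    return json.dumps({"iso": now.isoformat(), "offset_hours": tz_offset_hours})
```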

Switching LLM Providers

Update the model in backend/src/agent/utils.py:

# OpenAI
model = ChatLiteLLM(model="gpt-4")

# Anthropic
model = ChatLiteLLM(model="anthropic/claude-3-5-sonnet")

# Azure OpenAI
model = ChatLiteLLM(model="azure/gpt-4")

Adding Persistent Storage

Update the checkpointer to use PostgreSQL or other backends:

from langgraph.checkpoint.postgres import PostgresSaver

# from_conn_string is a context manager; pass the saver into create_deep_agent
with PostgresSaver.from_conn_string("postgresql://...") as checkpointer:
    checkpointer.setup()  # create the checkpoint tables on first run
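Conceptually, a checkpointer is just a store of conversation state keyed by thread_id. A toy in-memory sketch of the idea — illustrative only, not the LangGraph API:

```python
class ToyCheckpointer:
    """Illustration only: persist conversation state per thread_id."""

    def __init__(self) -> None:
        self._store: dict[str, list] = {}

    def put(self, thread_id: str, message: str) -> None:
        # Append to this thread's history, creating it on first use
        self._store.setdefault(thread_id, []).append(message)

    def get(self, thread_id: str) -> list:
        return self._store.get(thread_id, [])
```

Swapping MemorySaver for PostgresSaver changes where this state lives (process memory vs. a database), not the shape of the interaction.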

🧪 Testing

Backend Tests

cd backend
uv run pytest

Frontend Tests

cd frontend
npm test

Agent Evaluation

Use the Jupyter notebook for agent evaluation:

jupyter notebook notebooks/evaluate.ipynb

🤝 Contributing

This is a template project designed to be forked and customized for your own use cases. Feel free to:

  • Add new tools and capabilities
  • Enhance the UI with more generative components
  • Integrate with external APIs
  • Add authentication and authorization
  • Deploy to production Kubernetes clusters

📝 License

See LICENSE file for details.


💡 Use Cases

This template can be adapted for various AI agent applications:

  • 🌐 Research Assistants - Web search and document analysis
  • 📊 Data Analysis Tools - Query databases and visualize results
  • 🛒 E-commerce Assistants - Product search and recommendations
  • 📧 Email Automation - Draft and send emails
  • 📅 Scheduling Agents - Calendar management
  • 🔍 Code Analysis - Review and explain code
  • 📈 Financial Advisors - Market data and portfolio analysis

🙏 Acknowledgments

Built with amazing open-source technologies:


Happy Building! 🚀

For questions or issues, please open an issue on GitHub.
