notebooklm-mcp-secure

Secure NotebookLM MCP Server - Query Google NotebookLM from Claude/AI agents with 14 security hardening layers

Stars: 5 • Forks: 3 • Tools: 40 • Updated: Jan 4, 2026 • Validated: Jan 9, 2026

NotebookLM MCP Server (Security Hardened)

🏆 The World's Most Advanced NotebookLM MCP Server

Zero-hallucination answers • Gemini Deep Research • 14 Security Layers • Enterprise Compliance


What's New 2026 • Deep Research • Document API • Create Notebooks • Security • Install

The only NotebookLM MCP with enterprise-grade security, post-quantum encryption, and full Gemini API integration.

Security-hardened fork of PleasePrompto/notebooklm-mcp • Maintained by Pantheon Security


🚀 What's New in 2026

v2026.1.1 brings powerful new capabilities:

Feature | Description
🔍 Deep Health Check | Verifies NotebookLM chat UI actually loads — catches stale sessions
📊 Chat History Extraction | Recover conversations from browser, with pagination & file export
🎯 Context Management | Preview mode, offset pagination, output to file — never overflow context
📅 CalVer Versioning | Modern 2026.MINOR.PATCH format for predictable releases

# Quick install
claude mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest

Why Choose This MCP?

Capability | Other MCPs | This MCP
Query NotebookLM | ✅ Basic | ✅ + session management, quotas
Create notebooks programmatically | — | ✅ UNIQUE
Gemini Deep Research | — | ✅ EXCLUSIVE
Document API (no browser) | — | ✅ EXCLUSIVE
Post-quantum encryption | — | ✅ Future-proof
Enterprise compliance | — | ✅ GDPR/SOC2/CSSF
Chat history extraction | — | ✅ NEW
Deep health verification | — | ✅ NEW

Gemini Deep Research (v1.8.0)

The most powerful research capability for AI agents — now in your MCP toolkit.

v1.8.0 introduces the Gemini Interactions API as a stable, API-based research backend alongside browser automation. This gives your agents access to Google's state-of-the-art Deep Research agent.

Why This Matters

Challenge | Solution
Browser UI changes break automation | Gemini API is stable and versioned
Need comprehensive research but no research agent | Deep Research agent does it for you
Want current information with citations | Google Search grounding built-in
Need reliable, fast queries | API-based = no UI dependencies

New Tools

deep_research — Comprehensive Research Agent

"Research the security implications of post-quantum cryptography adoption in financial services"
  • Runs Google's Deep Research agent (same as Gemini Advanced)
  • Takes 1-5 minutes for comprehensive, web-grounded analysis
  • Returns structured answers with citations and sources
  • Perfect for complex topics requiring multi-source synthesis
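For scripted use outside a chat client, deep_research can be invoked through any MCP client. A minimal sketch, assuming the official @modelcontextprotocol/sdk TypeScript client and a "query" argument (the server's actual argument schema may differ):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, the same way the install commands do.
// GEMINI_API_KEY must already be set in the environment.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@pan-sec/notebooklm-mcp@latest"],
});
const client = new Client({ name: "research-script", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// "query" is an assumed parameter name, shown for illustration only.
const result = await client.callTool({
  name: "deep_research",
  arguments: {
    query: "Security implications of post-quantum cryptography adoption in financial services",
  },
});
console.log(result.content); // structured answer with citations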

gemini_query — Fast Grounded Queries

"What are the latest CVEs for Log4j in 2025?" (with Google Search)
"Calculate the compound interest on $10,000 at 5% over 10 years" (with code execution)
"Summarize this security advisory: [URL]" (with URL context)
  • Google Search grounding — Current information, not just training data
  • Code execution — Run calculations, data analysis
  • URL context — Analyze web pages on demand
  • Models: gemini-2.5-flash (fast), gemini-2.5-pro (powerful), gemini-3-flash-preview (latest)

get_research_status — Background Task Monitoring

Run deep research in the background and check progress:

"Start researching [topic] in the background"
... continue other work ...
"Check research status for interaction_abc123"

Hybrid Architecture

┌──────────────────────────────────────────────────────────────────────────────┐
│                      NotebookLM MCP Server v2026.1.x                         │
├──────────────────────────────────────────────────────────────────────────────┤
│                                                                              │
│  ┌────────────────────────────────┐    ┌──────────────────────────────────┐  │
│  │      BROWSER AUTOMATION        │    │          GEMINI API              │  │
│  │      (Your Documents)          │    │    (Research & Documents)        │  │
│  ├────────────────────────────────┤    ├──────────────────────────────────┤  │
│  │                                │    │                                  │  │
│  │  QUERY                         │    │  RESEARCH                        │  │
│  │  • ask_question                │    │  • deep_research                 │  │
│  │  • get_notebook_chat_history   │    │  • gemini_query                  │  │
│  │                                │    │  • get_research_status           │  │
│  │  CREATE & MANAGE               │    │                                  │  │
│  │  • create_notebook             │    │  DOCUMENTS                       │  │
│  │  • batch_create_notebooks      │    │  • upload_document               │  │
│  │  • manage_sources              │    │  • query_document                │  │
│  │  • generate_audio              │    │  • query_chunked_document        │  │
│  │  • sync_notebook               │    │  • list/delete_document          │  │
│  │                                │    │                                  │  │
│  │  HEALTH & SESSIONS     v2026   │    │                                  │  │
│  │  • get_health (deep_check)     │    │  Fast API • 48h retention        │  │
│  │  • get_query_history           │    │  Auto-chunking for large PDFs    │  │
│  └────────────────────────────────┘    └──────────────────────────────────┘  │
│                                                                              │
│                      ┌─────────────────────────────────┐                     │
│                      │       14 SECURITY LAYERS        │                     │
│                      │   Post-Quantum • Audit Logs     │                     │
│                      │   Cert Pinning • Memory Wipe    │                     │
│                      │   GDPR • SOC2 • CSSF Ready      │                     │
│                      └─────────────────────────────────┘                     │
└──────────────────────────────────────────────────────────────────────────────┘

Gemini Configuration

# Required for Gemini features
GEMINI_API_KEY=your-api-key          # Get from https://aistudio.google.com/apikey

# Optional settings
GEMINI_DEFAULT_MODEL=gemini-2.5-flash    # Default model
GEMINI_DEEP_RESEARCH_ENABLED=true        # Enable Deep Research
GEMINI_TIMEOUT_MS=30000                  # API timeout
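The same GEMINI_API_KEY can be exercised directly as a quick sanity check. A sketch using the @google/genai SDK with Google Search grounding (the server's own use of the API may differ):

import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });
const response = await ai.models.generateContent({
  model: process.env.GEMINI_DEFAULT_MODEL ?? "gemini-2.5-flash",
  contents: "What are the latest CVEs for Log4j?",
  config: { tools: [{ googleSearch: {} }] }, // Google Search grounding
});
console.log(response.text);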

When to Use Which

Task | Best Tool | Why
Questions about YOUR documents | ask_question | Grounded on your uploaded sources
Comprehensive topic research | deep_research | Multi-source synthesis with citations
Current events / recent info | gemini_query + google_search | Live web data
Code calculations | gemini_query + code_execution | Reliable computation
Analyze a webpage | gemini_query + url_context | Direct page analysis
Quick PDF/document analysis | upload_document + query_document | Fast API, no browser (NEW!)

📄 Document API (v1.9.0)

Upload and query documents directly via Gemini API — no browser automation needed.

v1.9.0 introduces the Gemini Files API for fast, reliable document analysis. Upload PDFs, analyze them instantly, and delete when done.

Why This Matters

Feature | Browser Mode | Document API
Speed | Seconds | Milliseconds
Reliability | UI-dependent | API-stable
File Support | Via NotebookLM | 50MB PDFs, 1000 pages
Retention | Permanent | 48 hours
Setup | Auth + cookies | Just API key

New Tools

upload_document — Fast Document Upload

Upload any document to Gemini for instant querying:

Upload /path/to/research-paper.pdf
  • Supported: PDF (50MB, 1000 pages), TXT, MD, HTML, CSV, JSON, DOCX, images, audio, video
  • 48-hour retention — files auto-expire, or delete manually
  • Returns a file ID for querying

query_document — Ask Questions About Documents

"What are the main findings in this research paper?"
"Summarize section 3 of the document"
"Extract all statistics mentioned in the PDF"
  • Full document understanding (text, tables, charts, diagrams)
  • Multi-document queries (compare multiple files)
  • Fast API response (no browser wait)

list_documents — See All Uploaded Files

List all my uploaded documents

Shows file names, sizes, MIME types, and expiration times.

delete_document — Clean Up Sensitive Files

Delete file xyz123

Immediately remove files (don't wait for 48h expiration).

Workflow Example

1. upload_document("/research/paper.pdf")
   → Returns: files/abc123

2. query_document("files/abc123", "What methodology was used?")
   → Returns: "The paper uses a mixed-methods approach combining..."

3. query_document("files/abc123", "List all cited authors")
   → Returns: "Smith et al. (2024), Johnson (2023)..."

4. delete_document("files/abc123")
   → File removed

Auto-Chunking for Large PDFs (v1.10.0)

No file size limits — PDFs of any size are automatically handled.

When you upload a PDF that exceeds Gemini's limits (50MB or 1000 pages), the system automatically:

  1. Detects the oversized PDF
  2. Splits it into optimal chunks (500 pages each)
  3. Uploads all chunks in parallel
  4. Returns chunk metadata for querying
upload_document("/research/massive-2000-page-report.pdf")

→ Returns:
{
  "wasChunked": true,
  "totalPages": 2000,
  "chunks": [
    { "fileName": "files/abc1", "pageStart": 1, "pageEnd": 500 },
    { "fileName": "files/abc2", "pageStart": 501, "pageEnd": 1000 },
    { "fileName": "files/abc3", "pageStart": 1001, "pageEnd": 1500 },
    { "fileName": "files/abc4", "pageStart": 1501, "pageEnd": 2000 }
  ],
  "allFileNames": ["files/abc1", "files/abc2", "files/abc3", "files/abc4"]
}
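Conceptually, the split works like the following sketch (illustrative only, using pdf-lib; the server's internal chunker may differ):

import { readFile } from "node:fs/promises";
import { PDFDocument } from "pdf-lib";

const CHUNK_PAGES = 500; // matches the 500-page chunks described above

async function splitPdf(path: string): Promise<Uint8Array[]> {
  const source = await PDFDocument.load(await readFile(path));
  const total = source.getPageCount();
  const chunks: Uint8Array[] = [];
  for (let start = 0; start < total; start += CHUNK_PAGES) {
    const count = Math.min(CHUNK_PAGES, total - start);
    const part = await PDFDocument.create();
    const pages = await part.copyPages(
      source,
      Array.from({ length: count }, (_, i) => start + i)
    );
    pages.forEach((page) => part.addPage(page));
    chunks.push(await part.save()); // each chunk is uploaded separately
  }
  return chunks;
}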

query_chunked_document — Query All Chunks at Once

For chunked documents, use this tool to query all parts and get an aggregated answer:

query_chunked_document(
  file_names: ["files/abc1", "files/abc2", "files/abc3", "files/abc4"],
  query: "What are the key recommendations in this report?"
)

→ Queries each chunk, then synthesizes a unified answer
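Under the hood this is a fan-out-and-synthesize pattern. A conceptual sketch, with the per-chunk query and the synthesis step passed in as hypothetical helper functions:

async function queryChunks(
  fileNames: string[],
  query: string,
  askChunk: (fileName: string, query: string) => Promise<string>,    // e.g. wraps query_document
  synthesize: (partials: string[], query: string) => Promise<string> // combines partial answers
): Promise<string> {
  // Fan out: ask every chunk the same question in parallel.
  const partials = await Promise.all(fileNames.map((name) => askChunk(name, query)));
  // Synthesize: merge the per-chunk answers into one unified response.
  return synthesize(partials, query);
}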

When to Use Document API vs NotebookLM

Scenario | Use
Quick one-off document analysis | Document API — fast, no setup
Building a permanent knowledge base | NotebookLM — permanent storage
Analyzing sensitive documents | Document API — 48h auto-delete
Multi-source research over time | NotebookLM — organized notebooks
CI/CD pipeline document processing | Document API — API-native
Large PDFs (1000+ pages) | Document API — auto-chunking

Programmatic Notebook Creation (v1.7.0+)

Create NotebookLM notebooks entirely from code — no manual clicks required.

Most MCP servers can only read from NotebookLM. This one can create notebooks, add sources, and generate audio — all programmatically.

create_notebook — Build Notebooks Instantly

Create a complete notebook with multiple sources in one command:

{
  "name": "Security Research 2025",
  "sources": [
    { "type": "url", "value": "https://owasp.org/Top10" },
    { "type": "file", "value": "/path/to/security-report.pdf" },
    { "type": "text", "value": "Custom analysis notes...", "title": "My Notes" }
  ],
  "description": "OWASP security best practices",
  "topics": ["security", "owasp", "vulnerabilities"]
}

Supported source types:

  • URL — Web pages, documentation, articles
  • File — PDF, DOCX, TXT, and more
  • Text — Raw text, code snippets, notes
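Issued from a script through an MCP client, the same request looks roughly like this (reusing the connected client from the deep_research sketch above; argument names mirror the JSON shown, but check the tool's schema):

const created = await client.callTool({
  name: "create_notebook",
  arguments: {
    name: "Security Research 2025",
    sources: [
      { type: "url", value: "https://owasp.org/Top10" },
      { type: "file", value: "/path/to/security-report.pdf" },
      { type: "text", value: "Custom analysis notes...", title: "My Notes" },
    ],
    description: "OWASP security best practices",
    topics: ["security", "owasp", "vulnerabilities"],
  },
});
console.log(created.content); // tool result (shape depends on the server)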

batch_create_notebooks — Scale Up

Create up to 10 notebooks in a single operation:

{
  "notebooks": [
    { "name": "React Docs", "sources": [{ "type": "url", "value": "https://react.dev/reference" }] },
    { "name": "Node.js API", "sources": [{ "type": "url", "value": "https://nodejs.org/api/" }] },
    { "name": "TypeScript Handbook", "sources": [{ "type": "url", "value": "https://www.typescriptlang.org/docs/" }] }
  ]
}

Perfect for:

  • Setting up project documentation libraries
  • Onboarding new team members with curated knowledge bases
  • Creating topic-specific research notebooks in bulk

manage_sources — Dynamic Source Management

Add or remove sources from existing notebooks:

{
  "notebook_id": "abc123",
  "action": "add",
  "sources": [{ "type": "url", "value": "https://new-documentation.com" }]
}

generate_audio — Audio Overview Creation

Generate NotebookLM's famous "Audio Overview" podcasts programmatically:

"Generate an audio overview for my Security Research notebook"

sync_notebook — Keep Sources Updated

Sync notebook sources from a local directory:

{
  "notebook_id": "abc123",
  "directory": "/path/to/docs",
  "patterns": ["*.md", "*.pdf"]
}

Why This Matters

Traditional Workflow | With This MCP
Manually create notebook in browser | create_notebook → done
Click "Add source" for each document | Batch add in single command
Navigate UI to generate audio | generate_audio → podcast ready
Update sources by hand | sync_notebook from local files

Your agent can now build entire knowledge bases autonomously.


📊 Query History & Chat Extraction (v2026.1.0)

Track your research and recover conversations from NotebookLM notebooks.

get_query_history — Review Past Research (v1.10.8)

All queries made through the MCP are automatically logged for review:

"Show me my recent NotebookLM queries"
"Find queries about security from last week"
"What did I ask the fine-tuning notebook?"
  • Automatic logging — every Q&A pair saved with metadata
  • Search — find specific topics across all queries
  • Filter — by notebook, session, or date
  • Quota tracking — see query counts and timing

get_notebook_chat_history — Extract Browser Conversations (v2026.1.0)

Extract conversation history directly from a NotebookLM notebook's chat UI with context management to avoid overwhelming your AI context window:

Quick audit (preview mode):

{ "notebook_id": "my-research", "preview_only": true }

Returns message counts without content — test the water before extracting.

Export to file (avoids context overflow):

{ "notebook_id": "my-research", "output_file": "/tmp/chat-history.json" }

Dumps full history to disk instead of returning to context.

Paginate through history:

{ "notebook_id": "my-research", "limit": 20, "offset": 0 }
{ "notebook_id": "my-research", "limit": 20, "offset": 20 }

Page through large histories without loading everything at once.

Returns:

{
  "notebook_url": "https://notebooklm.google.com/notebook/xxx",
  "notebook_name": "My Research",
  "total_messages": 150,
  "returned_messages": 40,
  "user_messages": 75,
  "assistant_messages": 75,
  "offset": 0,
  "has_more": true,
  "messages": [...]
}
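A sketch of paging through a long history with the has_more flag, assuming the MCP client from the earlier sketch and that each page comes back as a JSON text block shaped like the example above:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function fetchFullHistory(client: Client, notebookId: string) {
  const messages: unknown[] = [];
  for (let offset = 0; ; offset += 20) {
    const result = await client.callTool({
      name: "get_notebook_chat_history",
      arguments: { notebook_id: notebookId, limit: 20, offset },
    });
    const page = JSON.parse((result as any).content?.[0]?.text ?? "{}");
    messages.push(...(page.messages ?? []));
    if (!page.has_more) break; // last page reached
  }
  return messages;
}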

Use cases:

  • Recover conversations made directly in the NotebookLM browser (not tracked by MCP)
  • Audit research — see what queries were made in a notebook
  • Resume context — pick up where a previous session left off
  • Quota reconciliation — understand why quota seems off

Why This Fork?

The original NotebookLM MCP is excellent for productivity — but MCP servers handle sensitive data:

  • Browser sessions with Google authentication
  • Cookies and tokens stored on disk
  • Query history that may contain proprietary information

This fork adds 14 security hardening layers to protect that data.


Security Features

Layer | Protection
🔐 Post-Quantum Encryption | ML-KEM-768 + ChaCha20-Poly1305 hybrid
🔍 Secrets Scanning | Detects 30+ credential patterns (AWS, GitHub, Slack...)
📌 Certificate Pinning | Blocks MITM attacks on Google connections
🧹 Memory Scrubbing | Zeros sensitive data after use
📝 Audit Logging | Tamper-evident logs with hash chains
⏱️ Session Timeout | 8h hard limit + 30m inactivity auto-logout
🎫 MCP Authentication | Token-based auth with brute-force lockout
🛡️ Response Validation | Detects prompt injection attempts
Input Validation | URL whitelisting, sanitization
🚦 Rate Limiting | Per-session request throttling
🙈 Log Sanitization | Credentials masked in all output
🐍 MEDUSA Integration | Automated security scanning
🖥️ Cross-Platform | Native support for Linux, macOS, Windows
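As an illustration of what the secrets-scanning and log-sanitization layers do, a minimal pattern-based redactor might look like the sketch below. The fork ships its own detector covering 30+ patterns; these three regexes are examples only:

const SECRET_PATTERNS: Array<[label: string, pattern: RegExp]> = [
  ["AWS access key", /AKIA[0-9A-Z]{16}/g],
  ["GitHub token", /ghp_[A-Za-z0-9]{36}/g],
  ["Slack token", /xox[baprs]-[A-Za-z0-9-]{10,}/g],
];

function redactSecrets(text: string): string {
  let out = text;
  for (const [label, pattern] of SECRET_PATTERNS) {
    out = out.replace(pattern, `[REDACTED ${label}]`); // mask instead of logging the value
  }
  return out;
}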

Post-Quantum Ready

Traditional encryption (RSA, ECDH) will be broken by quantum computers. This fork uses hybrid encryption:

ML-KEM-768 (Kyber) + ChaCha20-Poly1305
  • ML-KEM-768: NIST-standardized post-quantum key encapsulation
  • ChaCha20-Poly1305: Modern authenticated stream cipher (resistant to timing attacks)

Even if one algorithm is broken, the other remains secure.
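A minimal sketch of the hybrid idea, assuming an ML-KEM-768 encapsulation has already produced a shared secret (the KEM step is library-specific and omitted here); the symmetric half uses ChaCha20-Poly1305 from Node's built-in crypto. This illustrates the scheme, not the fork's actual implementation:

import { createCipheriv, createHash, randomBytes } from "node:crypto";

function encryptWithSharedSecret(sharedSecret: Uint8Array, plaintext: Buffer) {
  // Derive a 32-byte symmetric key from the ML-KEM-768 shared secret.
  const key = createHash("sha256").update(sharedSecret).digest();
  const nonce = randomBytes(12); // ChaCha20-Poly1305 uses a 96-bit nonce
  const cipher = createCipheriv("chacha20-poly1305", key, nonce, { authTagLength: 16 });
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  // The ML-KEM ciphertext from encapsulation, the nonce, and the auth tag
  // must all be stored so the data can later be decrypted.
  return { nonce, ciphertext, authTag: cipher.getAuthTag() };
}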

Cross-Platform Support

Full native support for all major operating systems:

Platform | File Permissions | Data Directory
Linux | Unix chmod (0o600/0o700) | ~/.local/share/notebooklm-mcp/
macOS | Unix chmod (0o600/0o700) | ~/Library/Application Support/notebooklm-mcp/
Windows | ACLs via icacls (current user only) | %LOCALAPPDATA%\notebooklm-mcp\

All sensitive files (encryption keys, auth tokens, audit logs) are automatically protected with owner-only permissions on every platform.
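A sketch of how owner-only protection can be applied per platform (illustrative; the server's real implementation may differ in detail):

import { execFileSync } from "node:child_process";
import { chmodSync } from "node:fs";
import os from "node:os";

function restrictToOwner(filePath: string): void {
  if (process.platform === "win32") {
    // Windows: drop inherited ACLs, then grant full control to the current user only.
    const user = os.userInfo().username;
    execFileSync("icacls", [filePath, "/inheritance:r", "/grant:r", `${user}:F`]);
  } else {
    // Linux/macOS: owner read/write only (0o700 for directories).
    chmodSync(filePath, 0o600);
  }
}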

Enterprise Compliance (v1.6.0+)

Full compliance support for regulated industries:

Regulation | Features
GDPR | Consent management, DSAR handling, right to erasure, data portability
SOC2 Type II | Hash-chained audit logs, incident response, availability monitoring
CSSF | 7-year retention, SIEM integration, policy documentation

Compliance Tools (16 MCP tools)

compliance_dashboard    - Real-time compliance status
compliance_report       - Generate audit reports (JSON/CSV/HTML)
compliance_evidence     - Collect evidence packages
grant_consent          - Record user consent
submit_dsar            - Handle data subject requests
request_erasure        - Right to be forgotten
export_user_data       - Data portability export
create_incident        - Security incident management
...and 8 more

See COMPLIANCE-SPEC.md for full documentation.


Installation

Claude Code

claude mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest

With Authentication + Gemini (Recommended)

claude mcp add notebooklm \
  --env NLMCP_AUTH_ENABLED=true \
  --env NLMCP_AUTH_TOKEN=$(openssl rand -base64 32) \
  --env GEMINI_API_KEY=your-gemini-api-key \
  -- npx @pan-sec/notebooklm-mcp@latest

Codex

codex mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest

Cursor

Add to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
      "env": {
        "NLMCP_AUTH_ENABLED": "true",
        "NLMCP_AUTH_TOKEN": "your-secure-token",
        "GEMINI_API_KEY": "your-gemini-api-key"
      }
    }
  }
}

Other MCP Clients

{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
      "env": {
        "NLMCP_AUTH_ENABLED": "true",
        "NLMCP_AUTH_TOKEN": "your-secure-token",
        "GEMINI_API_KEY": "your-gemini-api-key"
      }
    }
  }
}

Quick Start

1. Install (see above)

2. Authenticate

"Log me in to NotebookLM"

Chrome opens → sign in with Google

3. Add your notebook

Go to notebooklm.google.com → Create notebook → Upload docs → Share link

4. Use it

"Research [topic] using this NotebookLM: [link]"

5. Try Deep Research (NEW!)

"Use deep research to investigate [complex topic]"

Complete Tool Reference

Research Tools

Tool | Description | Backend
ask_question | Query your NotebookLM notebooks | Browser
deep_research | Comprehensive research with citations | Gemini API
gemini_query | Fast queries with grounding tools | Gemini API
get_research_status | Check background research progress | Gemini API

Notebook Management

Tool | Description
add_notebook | Add notebook to library
list_notebooks | List all notebooks
get_notebook | Get notebook details
update_notebook | Update notebook metadata
remove_notebook | Remove from library
select_notebook | Set active notebook
search_notebooks | Search by query

Source Management (v1.7.0+)

Tool | Description
manage_sources | Add/remove/list sources
generate_audio | Create Audio Overview
sync_notebook | Sync sources from local files

Session & System

Tool | Description
list_sessions | View active sessions
close_session | Close a session
reset_session | Reset session chat
get_health | Server health check (with deep_check for UI verification)
get_query_history | Review past queries with search/filter
get_notebook_chat_history | Extract browser conversations (pagination, file export)
setup_auth | Initial authentication
re_auth | Re-authenticate
cleanup_data | Deep cleanup utility
get_library_stats | Library statistics
get_quota | Check usage limits and remaining quota

Compliance (v1.6.0+)

16 compliance tools for GDPR, SOC2, and CSSF requirements.


What Gets Protected

Data | Protection
Browser cookies | Post-quantum encrypted at rest
Session tokens | Auto-expire + memory scrubbing
Query history | Audit logged with tamper detection
Google connection | Certificate pinned (MITM blocked)
Log output | Credentials auto-redacted
API responses | Scanned for leaked secrets
Gemini API key | Secure memory handling

Configuration

All security features are enabled by default. Override via environment variables:

# Authentication
NLMCP_AUTH_ENABLED=true
NLMCP_AUTH_TOKEN=your-secret-token

# Gemini API (v1.8.0+)
GEMINI_API_KEY=your-api-key
GEMINI_DEFAULT_MODEL=gemini-2.5-flash
GEMINI_DEEP_RESEARCH_ENABLED=true
GEMINI_TIMEOUT_MS=30000

# Encryption
NLMCP_USE_POST_QUANTUM=true
NLMCP_ENCRYPTION_KEY=base64-32-bytes  # Optional custom key

# Session Limits
NLMCP_SESSION_MAX_LIFETIME=28800  # 8 hours
NLMCP_SESSION_INACTIVITY=1800     # 30 minutes

# Secrets Scanning
NLMCP_SECRETS_SCANNING=true
NLMCP_SECRETS_BLOCK=false         # Block on detection
NLMCP_SECRETS_REDACT=true         # Auto-redact

# Certificate Pinning
NLMCP_CERT_PINNING=true

# Audit Logging
NLMCP_AUDIT_ENABLED=true

See SECURITY.md for complete configuration reference.


Security Scanning

Run the MEDUSA security scanner:

npm run security-scan

Or integrate in CI/CD:

- name: Security Scan
  run: npx @pan-sec/notebooklm-mcp && npm run security-scan

Comparison

vs Other NotebookLM MCPs

Feature | Others | @pan-sec/notebooklm-mcp
Zero-hallucination Q&A | ✅ | ✅
Library management | ✅ | ✅
Create Notebooks Programmatically | — | ✅ EXCLUSIVE
Batch Create (10 notebooks) | — | ✅ EXCLUSIVE
Gemini Deep Research | — | ✅ EXCLUSIVE
Document API (no browser) | — | ✅ EXCLUSIVE
Auto-chunking (1000+ page PDFs) | — | ✅ EXCLUSIVE
Chat History Extraction | — | ✅ NEW
Deep Health Verification | — | ✅ NEW
Query History & Search | — | ✅
Quota Management | — | ✅
Source Management (add/remove) | — | ✅
Audio Overview Generation | — | ✅
Sync from Local Directories | — | ✅

Security & Compliance (Unique to This Fork)

Feature | Others | @pan-sec/notebooklm-mcp
Cross-platform (Linux/macOS/Windows) | ⚠️ Partial | ✅ Full
Post-quantum encryption | ❌ | ✅ ML-KEM-768 + ChaCha20
Secrets scanning | ❌ | ✅ 30+ patterns
Certificate pinning | ❌ | ✅ Google MITM protection
Memory scrubbing | ❌ | ✅ Zero-on-free
Audit logging | ❌ | ✅ Hash-chained
MCP authentication | ❌ | ✅ Token + lockout
Prompt injection detection | ❌ | ✅ Response validation
GDPR Compliance | ❌ | ✅ Full
SOC2 Type II | ❌ | ✅ Full
CSSF (Luxembourg) | ❌ | ✅ Full

Bottom line: If you need more than basic queries, or care about security, there's only one choice.


Version History

Version | Highlights
v2026.1.1 | 🔍 Deep health check — verifies NotebookLM chat UI actually loads
v2026.1.0 | 📊 Chat history extraction with context management, CalVer versioning
v1.10.8 | Query history logging, quota tracking
v1.10.0 | Auto-chunking for large PDFs (1000+ pages)
v1.9.0 | Document API: upload, query, delete via Gemini Files API
v1.8.0 | Gemini Deep Research, Query with Grounding, Background Tasks
v1.7.0 | Programmatic notebook creation, batch operations, audio generation
v1.6.0 | Enterprise compliance: GDPR, SOC2 Type II, CSSF
v1.5.0 | Cross-platform support (Windows ACLs, macOS, Linux)
v1.4.0 | Post-quantum encryption, secrets scanning

Reporting Vulnerabilities

Found a security issue? Do not open a public GitHub issue.

Email: support@pantheonsecurity.io


Credits

Original project: PleasePrompto/notebooklm-mcp. Security hardening and maintenance by Pantheon Security.

License

MIT — Same as original.


Security hardened with 🔒 by Pantheon Security

Powered by Google Gemini 🚀

Full Security Documentation • Compliance Guide • Report Vulnerability
