# Flowtion Intelligence MCP Server
A Claude-native Model Context Protocol (MCP) plugin that monitors 12 Australian SMB intelligence sources, analyzes signals for content relevance, and provides structured output for downstream content strategy agents.
🔗 Repository: https://github.com/lovishdhillon21-design/flowtion-mcp
## Project Structure

```
flowtion-mcp/
├── src/
│   └── flowtion/
│       ├── __init__.py            # Package initialization
│       └── server.py              # Main MCP server (10 tools, 1000+ lines)
├── scripts/
│   ├── run_daily.py               # Daily ingestion from 6 sources
│   ├── test_weekly.py             # Weekly ingestion test (all 12 sources)
│   ├── analyze_and_digest.py      # Signal analysis & Slack integration
│   └── send_digest.py             # Standalone digest sender
├── config/
│   ├── .env.example               # Template for API credentials
│   └── .env.local                 # (gitignored) Your actual secrets
├── output/
│   ├── weekly_fetch_result.json   # (gitignored) Weekly ingestion output
│   └── weekly_digest.txt          # (gitignored) Generated digest
├── docs/
│   └── README.md                  # Full documentation
├── requirements.txt               # Python dependencies
├── .gitignore                     # Excludes secrets & test outputs
└── README.md                      # This file
```
## Quick Start

### 1. Install Dependencies

```bash
pip install -r requirements.txt
```
### 2. Configure API Credentials

Copy the example config:

```bash
cp config/.env.example config/.env.local
```

Edit `config/.env.local` with your API keys:

- `FIRECRAWL_API_KEY` — https://www.firecrawl.dev
- `REDDIT_CLIENT_ID` / `REDDIT_CLIENT_SECRET` — https://www.reddit.com/prefs/apps
- `SLACK_BOT_TOKEN` — https://api.slack.com/apps
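The scripts read these keys from the process environment at startup. A minimal sketch of a `.env` loader, assuming a plain `KEY=VALUE` file format (the repo may instead rely on a library such as python-dotenv; `parse_env` and `load_env_file` are illustrative names, not the project's actual helpers):

```python
import os

def parse_env(lines):
    """Parse simple KEY=VALUE pairs; skips blanks, comments, and malformed lines."""
    env = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")
    return env

def load_env_file(path):
    """Load a .env file into the environment without clobbering shell-set vars."""
    with open(path) as f:
        for key, value in parse_env(f).items():
            os.environ.setdefault(key, value)
```

Using `setdefault` means anything already exported in your shell takes precedence over the file.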
### 3. Run Daily Ingestion

```bash
python scripts/run_daily.py
```

Fetches from 6 high-frequency sources (SmartCompany, The Rundown AI, Reddit, etc.) and returns the top content opportunities.
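Most of these sources are ingested as RSS feeds (the repo lists `feedparser` as a dependency). As a rough illustration of that step using only the standard library — `extract_rss_items` is a hypothetical helper, not the server's actual code:

```python
import xml.etree.ElementTree as ET

def extract_rss_items(rss_xml, limit=10):
    """Pull title/link pairs from an RSS 2.0 feed string."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in list(root.iter("item"))[:limit]:
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
        })
    return items
```

In the real pipeline, each extracted item would then be wrapped in a signal envelope and graded (see Signal Grades below).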
## Scripts

| Script | Purpose | Run From |
|---|---|---|
| `scripts/run_daily.py` | Fetch 6 daily sources, grade signals, show recommendations | Root directory |
| `scripts/test_weekly.py` | Fetch all 12 sources, save to `output/` | Root directory |
| `scripts/analyze_and_digest.py` | Analyze results, archive signals, send Slack alert | Root directory |
| `scripts/send_digest.py` | Send saved digest to Slack | Root directory |
Example workflow:

```bash
# Run daily
python scripts/run_daily.py

# Or weekly:
python scripts/test_weekly.py
python scripts/analyze_and_digest.py
```
## 12 Monitored Sources

### Government & Policy (Authority 25-29)

- S01: DISR / NAIC / CSIRO (29/30)
- S04: Productivity Commission (27/30)
- S09: Deloitte Access Economics AU (27/30)
- S10: ASBFEO (25/30)

### SMB-Focused (Authority 24-28)

- S02: MYOB (28/30)
- S03: SmartCompany (25/30)
- S06: QLD AI Hub (24/30)

### AI & Automation (Authority 21-23)

- S07: The Rundown AI (22/30)
- S08: n8n Blog (23/30)
- S12: Zapier Blog (21/30)

### Community & Content

- S05: Reddit (r/AusFinance, r/australia) (24/30)
- S11: LinkedIn Creators (18/30)
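A registry like this maps naturally onto a small immutable record type. The sketch below shows an illustrative subset — the field names and `by_min_authority` helper are assumptions; only the IDs, names, categories, and scores come from the list above:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Source:
    id: str
    name: str
    category: str
    authority: int  # score out of 30

# Illustrative subset of the 12 monitored sources.
SOURCES = [
    Source("S01", "DISR / NAIC / CSIRO", "Government & Policy", 29),
    Source("S03", "SmartCompany", "SMB-Focused", 25),
    Source("S07", "The Rundown AI", "AI & Automation", 22),
    Source("S11", "LinkedIn Creators", "Community & Content", 18),
]

def by_min_authority(sources, floor):
    """Return IDs of sources at or above an authority floor."""
    return [s.id for s in sources if s.authority >= floor]
```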
## Signal Grades
- Platinum (P) — AU SMB data, regulatory change, breakthrough tools → Act immediately
- Gold (G) — Strong AU relevance or high SMB utility → Content within 48hrs
- Silver (S) — Useful background, supports themes → Reference/archive
- Bronze (B) — Tangentially relevant → File for reference
- Noise (N) — Not relevant → Discard
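The grade ladder can be sketched as a simple scoring function. The input fields and thresholds below are hypothetical stand-ins for the server's actual grading rules (documented in `docs/README.md`); only the grade letters and actions come from the list above:

```python
GRADE_ACTIONS = {
    "P": "Act immediately",
    "G": "Content within 48hrs",
    "S": "Reference/archive",
    "B": "File for reference",
    "N": "Discard",
}

def grade_signal(au_smb_data=False, regulatory_change=False,
                 au_relevance=0, smb_utility=0):
    """Hypothetical heuristic; relevance/utility assumed to be 0-10 scores."""
    if au_smb_data or regulatory_change:   # Platinum conditions
        return "P"
    if au_relevance >= 8 or smb_utility >= 8:
        return "G"
    if au_relevance >= 5:
        return "S"
    if au_relevance >= 2:
        return "B"
    return "N"
```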
### Automatic Platinum Triggers
- DISR Tracker quarterly data
- Productivity Commission new report
- Deloitte new AU SMB research
- MYOB Business Monitor / AI product
- New government AI grant programs
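One way to implement such triggers is phrase matching over incoming headlines. The trigger phrases below are illustrative derivations from the list above; the real server may instead match on structured source IDs or feed metadata:

```python
# Illustrative trigger phrases; not the server's actual rule set.
PLATINUM_TRIGGERS = (
    "disr tracker",
    "productivity commission",
    "business monitor",
    "ai grant",
)

def is_auto_platinum(headline):
    """True if a headline matches any automatic-Platinum trigger phrase."""
    text = headline.lower()
    return any(trigger in text for trigger in PLATINUM_TRIGGERS)
```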
## Full Documentation

See `docs/README.md` for:
- Complete architecture & design
- All 10 MCP tools
- Signal envelope schema
- Grading rules & examples
- Integration patterns
## Using as MCP Plugin

To use with Claude or other MCP clients, add the server to your client configuration:

```json
{
  "mcpServers": {
    "flowtion": {
      "command": "python",
      "args": ["src/flowtion/server.py"],
      "env": {
        "FIRECRAWL_API_KEY": "...",
        "REDDIT_CLIENT_ID": "...",
        "REDDIT_CLIENT_SECRET": "...",
        "SLACK_BOT_TOKEN": "..."
      }
    }
  }
}
```
## Development

### Python Version

3.8+

### Key Dependencies

- `httpx` — async HTTP requests
- `feedparser` — RSS parsing
- `pydantic` — input validation
- `mcp` — Model Context Protocol
### Running Tests

```bash
python scripts/test_weekly.py
python scripts/analyze_and_digest.py
```
## License

MIT

Built by Lovish Dhillon for Australian SMB content strategy & market intelligence.