The Context Problem
For Developers:
You're implementing a new API endpoint in Cursor. You need to check how similar endpoints were built, reference the internal API style guide, review past code review comments, and find that Slack discussion about authentication decisions.
Currently? It's a tab-switching nightmare. Or worse, you're copying and pasting context into your AI tool, piece by piece, hoping you didn't miss something important.
For Knowledge Workers:
You're researching competitors in Claude Desktop. You have market research PDFs, email threads with the sales team, Slack discussions with product, and Notion docs outlining strategy.
Currently? The AI can't access any of this. You're manually feeding it context, losing the thread, and wondering if there's a better way.
The Core Problem: Your AI tools are incredibly powerful, but they're context-blind to your organizational knowledge.
The Solution: The Model Context Protocol (MCP) changes this.
What is MCP?
"MCP is like USB-C for AI" (credit to Anthropic)
Just as USB-C provides a standardized way to connect your devices to peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
The Problem It Solves
Before MCP, each AI client had its own approach to connecting with external data and tools. Some, like OpenAI, started building proprietary plugin architectures. Others implemented custom file upload mechanisms or one-off integrations.
Without a standard, this meant:
- Fragmented ecosystem — Each tool invented its own integration model
- Duplicated effort — Developers rebuilt the same integrations for every client
- Limited compatibility — Integrations built for Claude didn't work in Cursor
- Vendor lock-in — Switching AI tools meant rebuilding all your connections
- Slow adoption — Building n×m integrations (every tool × every data source) doesn't scale; a shared protocol reduces this to n + m, since each tool and each data source only implements MCP once
MCP does for AI integrations what the plugin architecture did for Photoshop: it provides a standard that allows an entire ecosystem to flourish. Build an integration once, use it everywhere.
How MCP Works
MCP introduces a standard protocol with three components:
- MCP Servers — Expose data sources and tools (like your knowledge base, GitHub repos, or databases)
- MCP Clients — Consume them (like Cursor, Claude Desktop, Goose)
- Standard Protocol — Enables any client to talk to any server
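To make the three pieces above concrete, here is a minimal sketch of an MCP server that exposes a single tool. It assumes the official TypeScript SDK (@modelcontextprotocol/sdk); the server name and the searchNotes tool are invented for illustration.

// Minimal MCP server sketch (illustrative). Assumes the official TypeScript SDK.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// One invented tool, "searchNotes", standing in for a real data source.
const server = new McpServer({ name: "example-notes", version: "1.0.0" });

server.tool("searchNotes", { query: z.string() }, async ({ query }) => ({
  content: [{ type: "text", text: `Results for: ${query}` }],
}));

// Any MCP client (Cursor, Claude Desktop, Goose, ...) can launch this process and call the tool.
await server.connect(new StdioServerTransport());

Because the protocol is standardized, this one server works unchanged in any MCP client.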
Benefits:
- Write once, use everywhere — One integration works in all MCP clients
- Composable AI stack — Mix and match the best tools for your needs
- No vendor lock-in — Switch AI tools without rebuilding integrations
- Open ecosystem — Community builds and shares MCP servers
MCP in the Wild
The ecosystem is growing rapidly:
- Sentry — Error monitoring and debugging (early MCP leader)
- GitHub — Access repositories and issues
- Slack — Read and post messages
- PostgreSQL — Query databases
- Filesystem — Read and write local files
- Exa — AI-powered code search across public GitHub repos
- Context7 (Upstash) — Search code documentation and best practices
- Browser — Interact with web pages
- Zine — Search your organizational knowledge
Zine's Unique Architecture: Server + Client
Here's where Zine stands out. Most products are either an MCP server or an MCP client. Zine is both.
Zine's MCP server is exposed as a remote Streamable HTTP endpoint at https://www.zine.ai/mcp, allowing external AI tools to access your knowledge. Internally, Zine also acts as an MCP client, connecting to both its built-in server and external MCP servers to power its chat interface.
Zine as MCP Server: Your Knowledge, Everywhere
Zine exposes your entire knowledge graph through MCP, making it accessible to any AI tool:
What's Available:
- 30+ data sources ingested (email, Slack, GitHub, Notion, SharePoint, and more)
- Vector search across everything
- Documents converted to Markdown
- Audio/video automatically transcribed
- Images automatically analyzed
- Always up-to-date
Works With:
- Cursor (coding)
- Claude Desktop (research)
- Goose (CLI automation)
- Cline (task execution)
- Windsurf (development)
- Any future MCP-compatible tool
Developer Example: Context-Aware Coding in Cursor
You: In Cursor, "How do we handle rate limiting?"
Cursor: [Uses Zine MCP server to search your docs]
Cursor: "Based on the engineering wiki and 3 past PRs,
your team uses a token bucket algorithm with Redis.
Here's code following your established patterns..."
[Generates code that matches your team's standards]
The AI didn't just generate generic rate limiting code—it referenced your actual documentation, past implementations, and team decisions.
Knowledge Worker Example: Research in Claude Desktop
You: In Claude Desktop, "Summarize all Q4 launch emails"
Claude: [Uses Zine MCP server to search emails]
Claude: "Here's a summary of 15 emails about Q4 launch:
- Timeline moved to Nov 15 (confirmed by Sarah)
- Budget approved at $150k (email from Finance)
- Beta partners: TechCorp, StartupXYZ (from partnerships)
..."
The AI accessed your actual emails, not generic information. Everything is grounded in your organizational context.
Quick Setup
One-time configuration in your AI tool:
// claude_desktop_config.json or cursor settings
{
"mcpServers": {
"zine": {
"url": "https://www.zine.ai/mcp",
"transport": "streamable-http",
"headers": {
"Authorization": "Bearer your-zine-api-token"
}
}
}
}
Get your API token from Zine Settings after signing up at www.zine.ai
Zine as MCP Client: Bring Other Tools In
Zine isn't just an MCP server—it's also an MCP client. This means you can connect Zine to other MCP servers:
- Linear (project management)
- GitHub (code repositories)
- Slack (real-time communication)
- Notion (documentation)
- Any MCP-compatible server
Example Workflow
You: In Zine, "Research JWT authentication best practices
and create implementation plan"
Zine workflow:
1. [Searches internal docs via built-in server]
2. [Uses Exa MCP to find public code examples]
3. [Uses Context7 to get official JWT docs]
4. [Synthesizes internal + external knowledge]
5. [Creates GitHub issue with implementation plan]
Result: Issue #247 created with comprehensive research
Why Both Directions Matter
Server: Your knowledge works in ANY AI tool
- Use Cursor for coding with your team's context
- Use Claude Desktop for research with your data
- Use Goose for automation with your knowledge
- Future tools work automatically
Client: Other tools work in Zine's unified interface
- Search your knowledge (Zine)
- Create issues (Linear/GitHub)
- Send messages (Slack)
- Update docs (Notion)
- All in one place
Result: A composable AI stack with no vendor lock-in.
Use Case Deep Dive: AI-Assisted Coding
The Developer Context Problem
Modern AI coding tools (Cursor, Claude Code, Cline, Windsurf) are incredibly powerful, but they face a context problem:
- Limited Context Window — Can only fit so much code
- No Organizational Memory — Don't know your patterns, decisions, or discussions
- Isolated from Knowledge — Can't access docs, Slack, past issues
When you're coding, you need more than just the files in front of you. You need:
- Technical documentation (wikis, API specs, ADRs)
- Historical context (past PRs, code reviews, design docs)
- Team knowledge (coding standards, patterns, debugging guides)
- External code examples (GitHub search, documentation)
- Best practices (from docs, Stack Overflow, official guides)
How Zine Solves This
When you connect Zine as an MCP server to your coding tool, the AI gets access to your complete organizational context:
Technical Documentation:
- Internal wikis and documentation sites
- API specifications (OpenAPI, GraphQL schemas)
- Architecture Decision Records (ADRs)
- Setup guides and runbooks
Historical Context:
- Past pull requests and code reviews
- GitHub issue discussions and resolutions
- Design documents from Notion/Confluence
- Slack conversations about technical decisions
Team Knowledge:
- Coding standards and style guides
- Common patterns and anti-patterns
- Debugging and troubleshooting guides
- Onboarding documentation
Real Workflow Examples
Example 1: Implementing a Feature in Cursor
Developer Context:
- Working on new authentication flow
- Needs to match existing patterns
- Wants to avoid past mistakes
In Cursor:
"How did we implement OAuth in the API service?"
Cursor + Zine MCP Server:
- Searches Zine knowledge base
- Finds:
  - Original OAuth implementation PR (#342)
  - Security review comments from @security-team
  - Slack discussion about token refresh strategy
  - Updated architecture doc about session management
- Generates code following established patterns
- References specific files and decisions in comments
Result: New code that matches team standards and avoids past pitfalls. The AI didn't just generate working code—it generated code that fits your codebase.
Example 2: Debugging with Cline
Developer Context:
- Investigating production bug
- Similar issue happened 6 months ago
- Solution was documented in Slack
In Cline:
"Help me debug this rate limiting error:
RateLimitExceeded: 429 Too Many Requests"
Cline + Zine MCP Server:
- Searches past Slack threads about rate limiting
- Finds previous incident and resolution
- Locates related monitoring dashboard (from runbook)
- Retrieves debugging guide from internal wiki
- Suggests fix based on past solution
Result: Faster resolution using institutional knowledge. Instead of reinventing the solution, you learned from past experience.
Example 3: Code Review Context in Goose
Developer Context:
- Reviewing a PR for security and performance
- Wants to check against team standards
- Needs to reference related decisions
In Goose CLI:
goose session start
> "Review this PR for security and performance"
Goose + Zine MCP Server:
- Retrieves security checklist from docs
- Finds performance guidelines (wiki)
- References past similar PRs
- Checks against relevant ADRs
- Generates detailed review with links
Result: Thorough review backed by team knowledge. Your code reviews are consistent with established standards.
Technical Architecture for Developers
When you use Zine's MCP server from a coding tool, you get access to:
// Available MCP Tools (6)
retrieveContext() // Vector search across your knowledge base
retrieveMemories() // Search previously stored memories
ingestMemory() // Store short-term notes/facts
deleteMemory() // Remove a memory by ID
searchWeb() // Web search via Exa
searchCode() // Code/docs search via Exa Code
// Available MCP Resources (1 type)
contents://{id} // Read individual content items with full Markdown text
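To show the client side of the tools listed above, here is a hedged sketch of calling retrieveContext over Streamable HTTP with the official TypeScript SDK. The argument shape for retrieveContext and the header-passing option are assumptions for illustration, not confirmed API details.

// Sketch: call Zine's retrieveContext tool from any MCP-capable client application.
// Assumes the official TypeScript SDK; argument names are illustrative.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const client = new Client({ name: "example-coding-tool", version: "1.0.0" });

await client.connect(
  new StreamableHTTPClientTransport(new URL("https://www.zine.ai/mcp"), {
    requestInit: { headers: { Authorization: `Bearer ${process.env.ZINE_API_TOKEN}` } },
  })
);

const result = await client.callTool({
  name: "retrieveContext",
  arguments: { query: "How do we handle rate limiting?" },
});
console.log(result.content); // typically text blocks containing the matching documents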
Smart Features:
- Vector Search — Semantic search finds relevant content even with different wording
- Markdown Conversion — PDFs and docs become readable, structured text
- Automatic Transcription — Video/audio meetings become searchable text
- Real-time Updates — Slack messages and emails indexed as they arrive
Privacy & Security:
- Self-hosted knowledge graph (Graphlit platform)
- No AI training on your data
- OAuth authentication for enterprise sources
- Encrypted at rest and in transit
- Granular access controls
Use Case Deep Dive: Knowledge Work
The Knowledge Worker Context Problem
If you're using Claude Desktop, ChatGPT, or similar tools for research, analysis, report writing, meeting preparation, or competitive intelligence—you're constantly copying information from multiple sources into the chat.
The AI has no memory of your organizational context. Every conversation starts from scratch.
How Zine Solves This
Research Workflows
Example 1: Competitive Analysis
You: In Claude Desktop
"Analyze our competitive positioning vs. Acme Corp"
Claude + Zine MCP Server:
1. Searches past sales calls (transcribed, ingested)
2. Finds market research reports (PDFs converted to Markdown)
3. Retrieves competitive analysis docs (Notion)
4. Pulls relevant email threads
5. Synthesizes comprehensive analysis
Result: "Based on 8 sales calls, 3 research reports, and 12 email
threads, here's how you compare to Acme Corp across pricing,
features, and market positioning..."
Deep analysis from scattered sources, all grounded in your actual data.
Example 2: Customer Success
You: In Claude Desktop
"Prepare briefing for TechCorp renewal call"
Claude + Zine MCP Server:
1. Finds all TechCorp emails from past year
2. Retrieves Slack discussions with CS team
3. Gets past meeting transcripts
4. Pulls support tickets from Linear (if connected)
5. Creates comprehensive briefing
Result: Complete account history in seconds, not hours of searching.
Example 3: Content Creation
You: In Claude Desktop
"Write blog post about our approach to AI safety"
Claude + Zine MCP Server:
1. Searches internal AI safety documentation
2. Finds relevant team discussions (Slack, email)
3. Retrieves past blog posts for voice/style
4. Gets quotes from leadership interviews (transcribed)
5. Drafts post matching company voice and using real quotes
Result: On-brand content backed by real sources and company positions.
Meeting Prep & Follow-up
Before Meetings:
- "Summarize all background on Project Phoenix"
- "What were the action items from last quarter's board meeting?"
- "Find past discussions about pricing strategy"
After Meetings:
- "Create summary and action items from today's transcript"
- "Compare today's decisions with past quarterly reviews"
- "Draft follow-up email with relevant docs attached"
Cross-Functional Workflows
Example: Product Launch
A marketing manager needs to:
- Understand technical features (engineering docs)
- Review competitive landscape (research reports)
- Check pricing strategy (executive emails)
- See customer feedback (Slack, support tickets)
With Zine MCP Server:
All of this is accessible in Claude Desktop via natural language queries. No manual searching across tools. The AI synthesizes insights from your complete context.
"Create product launch messaging that addresses competitive
differentiation and customer pain points"
Claude: [Accesses engineering specs, competitor research,
customer feedback threads, and past launch docs]
Result: Messaging that's technically accurate, competitively
positioned, and addresses real customer needs.
Technical Deep Dive: How It Works
Zine's Internal MCP Server Architecture
Built on the Graphlit knowledge platform:
// Remote MCP Server (www.zine.ai/mcp)
// Summary of what the server exposes, written as a plain object for readability.
const zineRemoteMcpServer = {
  // 6 MCP Tools
  tools: [
    'retrieveContext',   // Vector search across knowledge base
    'retrieveMemories',  // Search stored memories
    'ingestMemory',      // Store short-term notes
    'deleteMemory',      // Delete memory by ID
    'searchWeb',         // Web search via Exa
    'searchCode',        // Code/docs search via Exa Code
  ],
  // 1 MCP Resource type
  resources: [
    'contents://{id}',   // Individual content with Markdown text
  ],
} as const;
// Note: Zine is built on Graphlit, which supports 30+ data source types
// and has a more extensive MCP server (graphlit-mcp-server) with 34 tools
// for local stdio usage. Zine's remote server is simplified for security.
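Under the hood, every exchange is JSON-RPC 2.0. As a sketch, reading one of the contents://{id} resources above might look like this on the wire; the content id and the returned text are placeholders.

// Illustrative MCP wire exchange (JSON-RPC 2.0) for reading a content resource.
// "abc123" and the returned Markdown are made-up placeholders.
const request = {
  jsonrpc: "2.0",
  id: 5,
  method: "resources/read",
  params: { uri: "contents://abc123" },
};

const response = {
  jsonrpc: "2.0",
  id: 5,
  result: {
    contents: [
      { uri: "contents://abc123", mimeType: "text/markdown", text: "# Converted document..." },
    ],
  },
};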
Content Processing Pipeline
1. Ingestion: Connect data sources (OAuth, API keys, RSS)
2. Extraction:
   - PDFs → Markdown (text extraction with layout)
   - Videos/Audio → Transcripts (automatic transcription)
   - Web pages → Clean text (remove ads, navigation)
   - Office docs → Structured text
3. Vectorization: Semantic embeddings for search
4. Indexing: Real-time search index (full-text + vector)
5. MCP Exposure: Available to all clients via standard protocol
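As a rough illustration of how these stages fit together (not Zine's or Graphlit's actual code), the pipeline can be pictured as a small typed function; every name below is invented.

// Hypothetical sketch of the pipeline stages above; all names are invented.
interface RawContent { id: string; mimeType: string; data: Uint8Array; }
interface IndexedContent { id: string; markdown: string; embedding: number[]; }

async function processContent(
  raw: RawContent,
  extract: (c: RawContent) => Promise<string>,   // Extraction: PDF/audio/web/Office to Markdown or transcript
  embed: (text: string) => Promise<number[]>,    // Vectorization: semantic embedding for search
  index: (c: IndexedContent) => Promise<void>    // Indexing: full-text + vector index
): Promise<IndexedContent> {
  const markdown = await extract(raw);
  const embedding = await embed(markdown);
  const item: IndexedContent = { id: raw.id, markdown, embedding };
  await index(item); // once indexed, the item is available to MCP clients
  return item;
}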
Zine as MCP Client Architecture
When connecting external MCP servers:
// User can add any MCP server
interface MCPServerConfig {
name: string;
type: 'builtin' | 'oauth' | 'api-key';
transport: 'sse' | 'streamable-http' | 'stdio';
credentials?: OAuthTokens | ApiKey;
}
const connectedServers = [
{ name: 'graphlit', type: 'builtin' }, // Zine's internal
{ name: 'linear', type: 'oauth' }, // External with OAuth
{ name: 'github', type: 'oauth' }, // External with OAuth
{ name: 'slack', type: 'oauth' }, // External with OAuth
{ name: 'filesystem', type: 'api-key' }, // Local files
// User adds more as needed
];
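As a sketch of how a client application might work with several connected servers (purely illustrative, not Zine's internals), it can aggregate tool lists and route each call to whichever server exposes the requested tool. The Client type is from the official TypeScript SDK; the routing logic is invented.

// Illustrative routing sketch: find which connected MCP server exposes a tool and call it.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function callAcrossServers(
  servers: Map<string, Client>,          // e.g. keys: "graphlit", "linear", "github"
  toolName: string,
  args: Record<string, unknown>
) {
  for (const client of servers.values()) {
    const { tools } = await client.listTools();
    if (tools.some((t) => t.name === toolName)) {
      return client.callTool({ name: toolName, arguments: args });
    }
  }
  throw new Error(`No connected MCP server exposes "${toolName}"`);
}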
Security Features:
- OAuth 2.0 for enterprise integrations
- Secure token storage (encrypted, Redis)
- Automatic token refresh (background job)
- Per-tool permission controls (approve actions)
- Audit logging for all operations
Transport Support:
- SSE (Server-Sent Events): Web-based connections
- stdio: CLI tools like Goose
- Multi-transport proxy (connect anywhere)
MCP Server Types & Transports
Understanding the different types of MCP servers and how they connect is important when building your AI stack.
Local vs Remote MCP Servers
Local MCP Servers (stdio)
These run as local processes on your machine:
// Runs via npx - spins up a local Node.js process
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"]
}
}
}
Characteristics:
- Process-based: Each server runs as a separate process
- stdio communication: Client and server communicate via standard input/output
- Local access: Typically for local resources (files, databases on your machine)
- Simple auth: Often no authentication needed (runs with your permissions)
- Use cases: Filesystem access, local databases, system tools
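A client application can also launch a local stdio server programmatically. Here is a hedged sketch using the official TypeScript SDK and the filesystem server from the config above; the document path is an example.

// Launch the local filesystem MCP server as a child process and list its tools.
// Assumes the official TypeScript SDK; the document path is an example.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"],
});

const client = new Client({ name: "example-host", version: "1.0.0" });
await client.connect(transport);

const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // tools exposed by the filesystem server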
Remote MCP Servers (HTTP-based)
These run as web services you connect to over HTTP:
// Connects to remote server via HTTP
{
"mcpServers": {
"zine-remote": {
"url": "https://api.zine.ai/mcp",
"headers": {
"Authorization": "Bearer your-api-token"
}
}
}
}
Characteristics:
- HTTP-based: Connect to remote servers over the web
- Persistent services: Server runs independently, handles multiple clients
- Cloud resources: Access cloud data, team knowledge, external APIs
- Authentication required: OAuth, bearer tokens, API keys
- Use cases: Team knowledge bases, cloud services, shared tools
Transport Evolution: SSE → Streamable HTTP
The MCP ecosystem is evolving its transport layer:
SSE (Server-Sent Events) - Being Deprecated
Early remote MCP implementations used SSE:
- One-way server → client streaming
- Built on HTTP
- Limited browser support
- Connection management challenges
Streamable HTTP - New Preferred Method
The MCP specification now recommends Streamable HTTP:
- Bidirectional streaming over HTTP
- Better support for request/response patterns
- Improved error handling
- Works with modern HTTP/2 and HTTP/3
- Better firewall and proxy compatibility
// Modern Streamable HTTP approach, sketched with the official TypeScript SDK
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const client = new Client({ name: "example-client", version: "1.0.0" });

// The bearer token is passed via standard fetch request options; check the SDK
// docs for the exact option name in your version.
const transport = new StreamableHTTPClientTransport(
  new URL("https://www.zine.ai/mcp"),
  { requestInit: { headers: { Authorization: "Bearer your-token" } } }
);

await client.connect(transport);
Why It Matters:
If you're building MCP servers, prefer Streamable HTTP for remote access. If you're using existing servers, check their documentation—many are migrating from SSE to Streamable HTTP.
Zine's MCP server supports both approaches for backward compatibility, but we recommend Streamable HTTP for new integrations.
Authentication Methods
MCP servers use various authentication approaches depending on their architecture:
1. OAuth 2.1 (Recommended for Production)
Modern, secure, user-delegated authentication:
// OAuth 2.1 flow
const server = {
name: 'github',
transport: 'streamable-http',
url: 'https://mcp.github.com',
auth: {
type: 'oauth2.1',
clientId: 'your-client-id',
scopes: ['repo', 'issues'],
// Token automatically refreshed
}
}
Benefits:
- User delegates access (doesn't share password)
- Automatic token refresh
- Granular scopes (least privilege)
- Revocable (user can revoke access)
- Industry standard security
Use Cases:
- Enterprise integrations (GitHub, Linear, Slack)
- Team knowledge bases
- Cloud services
In Zine: We use OAuth 2.1 for external MCP servers, with automatic token refresh and encrypted storage.
2. Bearer Tokens / API Keys (Common Today)
Simple token-based authentication:
// Bearer token approach
const server = {
name: 'api-service',
transport: 'streamable-http',
url: 'https://api.example.com/mcp',
auth: {
type: 'bearer',
token: 'your-long-api-token-here'
}
}
Characteristics:
- Simple to implement
- No OAuth flow needed
- Token is long-lived
- No automatic refresh
- Manual rotation required
Use Cases:
- Internal tools
- Personal API keys
- Development/testing
Security Note: Store tokens securely (encrypted, environment variables). Never commit to version control.
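For example, here is a minimal sketch of loading the token from an environment variable instead of hardcoding it (the variable name is illustrative):

// Read the bearer token from the environment rather than hardcoding it in source or config.
const token = process.env.EXAMPLE_MCP_TOKEN; // variable name is illustrative
if (!token) {
  throw new Error("EXAMPLE_MCP_TOKEN is not set");
}
const headers = { Authorization: `Bearer ${token}` };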
3. Custom Headers (Flexible)
Some services use custom authentication headers:
// Custom header authentication
const server = {
name: 'custom-service',
transport: 'streamable-http',
url: 'https://api.custom.com/mcp',
headers: {
'X-API-Key': 'your-api-key',
'X-Client-ID': 'your-client-id',
'X-Custom-Header': 'custom-value'
}
}
Use Cases:
- Legacy systems
- Custom enterprise integrations
- Services with non-standard auth
4. No Authentication (Local Only)
Local stdio servers typically need no auth:
// Local - runs with your OS permissions
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "./"]
}
}
}
The process runs with your user permissions, so no separate authentication is needed.
Zine's Approach
As an MCP Server:
Zine exposes your knowledge as a remote Streamable HTTP MCP server:
{
"mcpServers": {
"zine": {
"url": "https://www.zine.ai/mcp",
"transport": "streamable-http",
"headers": {
"Authorization": "Bearer your-zine-api-token"
}
}
}
}
Benefits of Remote Server:
- No local process to manage
- Works from any machine
- Team members can share same knowledge base
- Always up-to-date (server-side)
- Scales automatically
Note: If you're a Graphlit platform user (not Zine), there's also graphlit-mcp-server (local stdio) for direct platform access. This is separate from Zine's MCP server.
As an MCP Client:
Zine supports connecting to:
- Local stdio servers (filesystem, local databases)
- Remote HTTP servers (both SSE and Streamable HTTP)
- OAuth 2.1 authentication (GitHub, Linear, Slack)
- Bearer token auth (API services)
- Custom headers (enterprise integrations)
Security Features:
- Encrypted token storage (Redis, AES-256)
- Automatic OAuth token refresh
- Per-tool permission controls
- Audit logging for all actions
The Composable AI Stack
Key Message: Don't get locked into one tool. Build your own AI stack.
The Vision: Mix and Match MCP Servers
Your Personal AI Workspace
Knowledge Layer (Zine)
↓ MCP protocol
+ Development (GitHub, GitLab)
↓ MCP protocol
+ Project Management (Linear, Jira)
↓ MCP protocol
+ Communication (Slack, Teams)
↓ MCP protocol
+ Documentation (Notion, Confluence)
↓ MCP protocol
= Complete AI-powered workflow
Use ANY AI Client
- Cursor (coding)
- Claude Desktop (research)
- Goose (CLI automation)
- Cline (task execution)
- Windsurf (development)
- Future tools (protocol is open)
Your integrations come with you.
Why This Matters
No Vendor Lock-in
- Switch AI clients anytime
- The protocol is an open standard (from Anthropic)
- Your integrations are portable across tools
Best-of-Breed Stack
- Use best tool for each job
- Not forced into one vendor's suite
- Compose your ideal workflow
Future-Proof
- New AI tools support MCP automatically
- New data sources = new MCP servers
- Community ecosystem effect
Open Ecosystem
- Anyone can build MCP servers
- Open source implementations
- Share servers with community
Example Stacks
For Developers:
Zine (internal knowledge)
+ GitHub (your repos)
+ Exa (public code search)
+ Context7 (documentation search)
+ Linear (project management)
→ Used in Cursor for comprehensive coding context
For Researchers:
Zine (research & knowledge)
+ Notion (notes)
+ Google Drive (files)
+ Exa (web research)
→ Used in Claude Desktop for analysis
For Teams:
Zine (team knowledge base)
+ Slack (communications)
+ Jira (project tracking)
+ GitHub (code repositories)
→ Used across multiple tools by entire team
Getting Started
Path 1: Use Zine's MCP Server (For Developers)
Connect Zine to your coding tool:
Configuration
For Cursor, Claude Desktop, Windsurf:
Add to your tool's MCP settings. For Claude Desktop, the config file is located at:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Cursor and Windsurf have equivalent MCP configuration in their own settings.
{
"mcpServers": {
"zine": {
"url": "https://www.zine.ai/mcp",
"transport": "streamable-http",
"headers": {
"Authorization": "Bearer your-zine-api-token"
}
}
}
}
For Goose CLI:
Add Zine as a remote MCP server in your Goose configuration with the same URL and authorization header.
Setup Steps
1. Sign up for Zine at www.zine.ai (free tier available)
2. Ingest your knowledge sources (Gmail, Slack, Notion, GitHub, etc.)
3. Get your API token from Zine Settings → API Access
4. Add to your AI tool's MCP configuration (above)
5. Start querying your knowledge in your AI tool
Example Queries
In your AI tool, try:
- "Find our API authentication documentation"
- "How did we handle this error in past PRs?"
- "Summarize Slack discussions about the new feature"
- "What does our style guide say about error handling?"
Path 2: Connect MCP Servers to Zine (For Knowledge Workers)
Use Zine as your unified AI workspace:
1. Sign up for Zine at www.zine.ai
2. Connect data sources (Gmail, Slack, Notion, etc.)
3. Add MCP servers in Settings → MCP Servers:
   - Linear (project management)
   - GitHub (code repositories)
   - Slack (if not using built-in connector)
   - Other OAuth-enabled servers
4. Authenticate with OAuth (automatic token refresh)
5. Start using combined tools in Zine's interface
What You Can Do
- Chat with your knowledge using Zine's AI interface
- Execute actions across connected tools
- Build custom workflows combining multiple servers
- Research and analyze with full organizational context
Example Commands in Zine
- "Find all docs about our API architecture"
- "Create Linear issue from this research"
- "Summarize last week's Slack discussions about launch"
- "Search GitHub for similar implementations of auth"
What Makes Zine Different
MCP Servers in the Wild
The MCP ecosystem is growing rapidly with many excellent servers:
Development Tools:
- Filesystem — Local file access
- PostgreSQL — Database queries
- GitHub — Code repositories and issues
- Exa — AI-powered code search across public GitHub repos
- Context7 (Upstash) — Search code documentation and best practices
Productivity & Communication:
- Slack — Team communication
- Linear — Project management
- Notion — Documentation and notes
- Browser — Web automation and scraping
Zine's Unique Position
Zine is built on the Graphlit knowledge platform, leveraging its APIs for content ingestion, vector embeddings, and semantic search at scale.
1. Knowledge-First Design
Built specifically for ingesting and searching organizational knowledge:
- Not just data access, but understanding
- Semantic search, not just keyword matching
- Connections between disparate content
2. Multi-Source Aggregation
30+ sources in one unified knowledge graph:
- Email, chat, documents, code, project management
- Unified search across everything
- Cross-reference between sources
3. Bidirectional Integration
Unique dual role:
- Act as server (expose knowledge to any AI tool)
- Act as client (use other MCP tools in Zine)
- Compose complete workflows across tools
4. Production-Ready
Built for teams and enterprise use:
- OAuth authentication
- Automatic token refresh
- Secure credential storage
- Permission controls
- Audit logging
5. Developer-Focused
Optimized for coding use cases:
- Rich content exposure (not just metadata)
- Markdown conversion of documents
- Automatic transcription of media
- Code-aware search and retrieval
Your Knowledge, Everywhere
The Core Problem
Your knowledge is scattered across dozens of tools. Your AI assistants are isolated from this context. The result? You're stuck copying and pasting, context-switching, and wondering why your powerful AI tools feel so disconnected from your actual work.
The MCP Solution
The Model Context Protocol provides an open, standardized way to connect AI tools to your data sources and tools. Write an integration once, use it everywhere.
Zine's Approach
As an MCP Server: Your organizational knowledge becomes available in ANY AI tool
- Use Cursor for coding with your team's context
- Use Claude Desktop for research with your data
- Use Goose for automation with your knowledge
- Future tools work automatically
As an MCP Client: Other tools become available in Zine's workspace
- Combine knowledge search with action (Linear, GitHub, Slack)
- Unified interface for research and execution
- Build workflows across multiple tools
Result: Knowledge that actually works for you, wherever you work.
Get Started Today
For Developers
Connect Zine's MCP server to your AI coding tool using the manual configuration shown above (see "Configuration" section).
For Knowledge Workers
Sign up for Zine and start connecting your knowledge:
Visit: www.zine.ai
Learn More
- Sign Up: www.zine.ai - Start connecting your knowledge
- MCP Specification: modelcontextprotocol.io - Learn about the protocol
- Community: Join our Discord to share workflows
- For Graphlit Users: Graphlit MCP Server - Local stdio server for Graphlit platform
The Model Context Protocol is an open standard from Anthropic. Zine's MCP server is available at www.zine.ai/mcp as a production-ready Streamable HTTP endpoint.