
Connect Claude Desktop, Cursor, VS Code, and other MCP-compatible clients to Nebula. Your AI assistant can store and search memories directly using our web-hosted MCP server.

Quick Setup

The easiest way to connect is through your collection’s Connect dialog at trynebula.ai:
  1. Navigate to your collection
  2. Click the “Connect” button
  3. Select the “MCP” tab
  4. Choose your AI client (Cursor, VS Code, Claude Desktop, etc.)
  5. Generate an API key automatically
  6. Use the one-click deeplink or copy the configuration
This automatically generates a dedicated API key for your MCP client and provides ready-to-use configuration.

Configuration

Get your API key from trynebula.ai → Settings → API Keys, or generate one automatically through the collection’s Connect dialog.
Cursor supports HTTP-based MCP servers with native transport.

Configuration File: .cursor/mcp.json
{
  "mcpServers": {
    "nebula-memory": {
      "url": "https://mcp.trynebula.ai/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY",
        "X-Collection-ID": "YOUR_COLLECTION_ID"
      }
    }
  }
}
View Cursor MCP docs

Available Tools

Once connected, your AI assistant gets these memory operations:

add_memory

Store documents or conversation messages. Parameters:
  • content (required): Text to store
  • role (optional): For conversation messages - either user or assistant
  • metadata (optional): Custom metadata object
Returns:
  • Shared fields: success, message, and optional task_id
  • Document write: engram_id and status
  • Role-based conversation write: conversation_id and message_ids
Example:
add_memory(content="User prefers dark mode", metadata={"user_id": "123"})

# Conversation/message write
add_memory(content="Remember this preference", role="user")

search_memories

Find and retrieve relevant memories semantically. Parameters:
  • query (required): Search text
  • effort (optional): Search effort level - auto/low/medium/high (defaults to auto if omitted)
Example:
search_memories(query="user preferences")
# Or explicitly set effort:
search_memories(query="user preferences", effort="medium")
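
Under the hood, MCP clients invoke these tools via JSON-RPC 2.0 tools/call requests. As a sketch of the wire format (a real session also performs the MCP initialize handshake first, and the header values are placeholders), the search_memories call above corresponds to roughly:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_memories",
    "arguments": {
      "query": "user preferences",
      "effort": "medium"
    }
  }
}
```

You normally never construct this by hand; your MCP client builds it for you when the assistant decides to call a tool.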

Troubleshooting

Tools not appearing:
  • Restart your AI assistant after config changes
  • Verify API key is valid at trynebula.ai → Settings → API Keys
  • Check that the collection ID is correct
  • Ensure the configuration file is in the correct location for your client
Connection errors:
  • Verify your API key has the correct permissions
  • Check that the collection ID exists in your account
  • Ensure your firewall/proxy allows connections to mcp.trynebula.ai
  • Check the client’s console/logs for detailed error messages
Authentication issues:
  • Make sure the Authorization header includes the “Bearer ” prefix
  • Verify the X-Collection-ID header matches your collection
  • Try generating a new API key through the Connect dialog

Web-Hosted vs Local Installation

All supported clients can connect to Nebula’s web-hosted MCP server.

Direct HTTP Support:
  • Cursor - Native HTTP transport
  • VS Code - Native HTTP transport
  • Windsurf - Native HTTP transport
  • Claude Code - Native HTTP transport
Via mcp-remote Bridge:
  • Claude Desktop - Uses mcp-remote to bridge HTTP to stdio
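
As a sketch of the bridged setup (the --header flag follows mcp-remote’s documented usage; verify the flag syntax against your mcp-remote version), a claude_desktop_config.json entry might look like:

```json
{
  "mcpServers": {
    "nebula-memory": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "https://mcp.trynebula.ai/mcp",
        "--header", "Authorization: Bearer YOUR_API_KEY",
        "--header", "X-Collection-ID: YOUR_COLLECTION_ID"
      ]
    }
  }
}
```

mcp-remote runs locally, speaks stdio to Claude Desktop, and forwards each request to the hosted HTTP endpoint with the headers attached.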
Benefits:
  • No local installation or dependencies required
  • Always up-to-date with the latest features
  • Simple configuration with just URL and headers
  • Automatic API key generation per client
  • Works seamlessly across all clients

Local Installation (Optional)

If you prefer running an MCP server locally, use @nebula-ai/sdk-mcp. It is generated by Stainless from our public OpenAPI spec, so it covers the full Nebula API rather than the curated add_memory / search_memories tools the hosted server exposes.

Direct invocation:
export NEBULA_API_KEY="your_api_key_here"
npx -y @nebula-ai/sdk-mcp@latest
Example client config (Cursor, Claude Desktop, Claude Code via stdio):
{
  "mcpServers": {
    "nebula": {
      "command": "npx",
      "args": ["-y", "@nebula-ai/sdk-mcp"],
      "env": {
        "NEBULA_API_KEY": "your_api_key_here"
      }
    }
  }
}
VS Code uses servers instead of mcpServers. To run as a remote server, launch with --transport=http --port=3000 and authenticate with Authorization: Bearer … or X-API-Key.
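
For example, a VS Code configuration for the local stdio server might look like the following sketch (.vscode/mcp.json uses a top-level servers key per VS Code’s MCP docs; confirm the exact schema against your VS Code version):

```json
{
  "servers": {
    "nebula": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@nebula-ai/sdk-mcp"],
      "env": {
        "NEBULA_API_KEY": "your_api_key_here"
      }
    }
  }
}
```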
The local server uses MCP “Code Mode”: instead of one tool per endpoint, it exposes a search_docs tool and an execute tool that runs TypeScript against a pre-authenticated @nebula-ai/sdk client in a sandbox. Pass the target collection_id as an argument inside each execute call.
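
To illustrate the shape of a Code Mode call (the SDK method names below are illustrative assumptions, not confirmed API - use the search_docs tool to discover the real @nebula-ai/sdk surface), the TypeScript passed to execute might look like:

```typescript
// Hypothetical body for an `execute` tool call. The sandbox provides a
// pre-authenticated `client` (@nebula-ai/sdk); method names here are
// placeholders -- query search_docs for the actual SDK methods.
const results = await client.search({
  collection_id: "YOUR_COLLECTION_ID", // pass the target collection explicitly
  query: "user preferences",
});
return results;
```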
Use this when you want:
  • Full API coverage (every endpoint Stainless generates an SDK method for)
  • Local execution / offline development
  • A self-hosted MCP under your own auth and network policies