## Overview

The Parallel MCP Servers expose Parallel APIs to AI assistants and large language model (LLM) workflows, delivering high-quality, relevant results from the web while optimizing for the price-performance balance your AI applications need at scale. The following MCP servers can be installed separately in any MCP client. They can also be used programmatically by providing your Parallel API key in the `Authorization` header as a Bearer token.

- Search MCP: https://search-mcp.parallel.ai/mcp
- Task MCP: https://task-mcp.parallel.ai/mcp
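For programmatic use, here is a minimal sketch of the Bearer-token setup. It assumes the server speaks JSON-RPC 2.0 over HTTP (MCP's standard transport) and uses a hypothetical `build_initialize_request` helper; the API key, client name, and protocol version shown are placeholders, not values confirmed by this page:

```python
import json

# Placeholder -- substitute your real Parallel API key.
API_KEY = "YOUR_PARALLEL_API_KEY"
SEARCH_MCP_URL = "https://search-mcp.parallel.ai/mcp"

def build_initialize_request(request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 `initialize` request that opens an MCP session."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # assumed MCP protocol revision
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }

# The Authorization header carries the Parallel API key as a Bearer token.
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}
body = json.dumps(build_initialize_request())
# POST `body` with `headers` to SEARCH_MCP_URL using your HTTP client of choice.
```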
## MCP Server Installation Guide
### Cursor

Add to `~/.cursor/mcp.json` or `.cursor/mcp.json` (project-specific):
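A minimal sketch of that file, assuming Cursor's remote-server schema (`url` plus optional `headers`); the server names `parallel-search`/`parallel-task` and the key value are placeholders:

```json
{
  "mcpServers": {
    "parallel-search": {
      "url": "https://search-mcp.parallel.ai/mcp",
      "headers": { "Authorization": "Bearer YOUR_PARALLEL_API_KEY" }
    },
    "parallel-task": {
      "url": "https://task-mcp.parallel.ai/mcp",
      "headers": { "Authorization": "Bearer YOUR_PARALLEL_API_KEY" }
    }
  }
}
```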
### VS Code

Add to `settings.json` in VS Code:
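A sketch of the corresponding `settings.json` entry, assuming VS Code's `mcp.servers` schema with an HTTP transport; server name and key are placeholders:

```json
{
  "mcp": {
    "servers": {
      "parallel-search": {
        "type": "http",
        "url": "https://search-mcp.parallel.ai/mcp",
        "headers": { "Authorization": "Bearer YOUR_PARALLEL_API_KEY" }
      }
    }
  }
}
```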
### Claude Desktop / Claude.ai

Go to Settings → Connectors → Add Custom Connector, and fill in the connector name and the remote MCP server URL.

### Claude Code

Run this command in your terminal:
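A sketch of that command, assuming the `claude mcp add` syntax for HTTP transports; the server name `parallel-search` and the key are placeholders:

```shell
claude mcp add --transport http parallel-search https://search-mcp.parallel.ai/mcp \
  --header "Authorization: Bearer YOUR_PARALLEL_API_KEY"
```

Repeat with the Task MCP URL to register the second server.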
### Windsurf

Add to your Windsurf MCP configuration:
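A sketch of that configuration, assuming Windsurf's `serverUrl` key for remote servers (the key name and server label are assumptions, not confirmed by this page):

```json
{
  "mcpServers": {
    "parallel-search": {
      "serverUrl": "https://search-mcp.parallel.ai/mcp",
      "headers": { "Authorization": "Bearer YOUR_PARALLEL_API_KEY" }
    }
  }
}
```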
### Cline

Go to the MCP Servers section → Remote Servers → Edit Configuration:
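A sketch of the remote-server entry, assuming Cline accepts the common `url`/`headers` shape (the exact schema is an assumption; the server name and key are placeholders):

```json
{
  "mcpServers": {
    "parallel-search": {
      "url": "https://search-mcp.parallel.ai/mcp",
      "headers": { "Authorization": "Bearer YOUR_PARALLEL_API_KEY" }
    }
  }
}
```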
### Gemini CLI

Add to `~/.gemini/settings.json`:
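A sketch of that entry, assuming Gemini CLI's `httpUrl` key for streamable-HTTP MCP servers; the server name and key are placeholders:

```json
{
  "mcpServers": {
    "parallel-search": {
      "httpUrl": "https://search-mcp.parallel.ai/mcp",
      "headers": { "Authorization": "Bearer YOUR_PARALLEL_API_KEY" }
    }
  }
}
```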