- Create Deep Research Task - Initiates a deep research task, returns details to view progress
- Create Task Group - Initiates a task group to enrich multiple items in parallel.
- Get Result - Retrieves the results of both deep research tasks and task groups in an LLM-friendly format.
- Choose a data source - See Enrichment data sources and destinations.
- Initiate your tasks - After you have your initial data, the MCP can initiate deep research or task groups. See Use cases for inspiration.
- Analyze the results - The LLM provides a link to view progress as results come in. After completion, prompt the LLM to analyze the results and answer your questions.
Enrichment data sources and destinations
The task group tool can be used directly from LLM memory, but it is often combined with a data source. The following data sources work well with the Task Group tool:
- Upload tabular files - Use the Task MCP with Excel sheets or CSVs. Some LLM clients (such as ChatGPT) allow uploading Excel or CSV files; availability varies by client.
- Connect with databases - Several MCPs allow your LLM to retrieve data from your database, such as Supabase MCP and Neon MCP.
- Connect with documents - Documents may contain vital initial information to start a task group. See Notion MCP and Linear MCP.
- Connect with web search data - Parallel Search MCP or other web tools can provide an initial list of items, which is often a great starting point for a task group.
Use cases
The Task MCP serves two main purposes. First, it makes Parallel APIs accessible to anyone requiring reliable research or enrichment without coding skills. Second, it’s a great tool for developers to experiment with different use cases and see output quality before writing code. Below are examples of using the Task MCP (sometimes in combination with other MCPs):

Day-to-day data enrichment and research:
- Sentiment analysis for ecommerce products
- Improving product listings for a web store
- Fact checking
- Deep researching every major MCP client and creating a matrix of the results
- Reddit sentiment analysis
- Comparing output quality between two processors
- Testing and iterating on entity resolution for social media profiles
- Performing 100 deep research tasks and analyzing result quality
Installation
The Task MCP can be installed in any MCP client. The server URL is:
https://task-mcp.parallel.ai/mcp
The Task MCP can also be used programmatically by providing your Parallel API key in the Authorization header as a Bearer token.
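As a sketch of what a programmatic call looks like: the request below follows MCP’s streamable HTTP transport (a JSON-RPC POST), with the Bearer token in the Authorization header. The `PARALLEL_API_KEY` environment variable and the `tools/list` request are illustrative.

```python
import json
import os

# Placeholder: set PARALLEL_API_KEY in your environment
# (get a key from platform.parallel.ai).
api_key = os.environ.get("PARALLEL_API_KEY", "YOUR-PARALLEL-API-KEY")

# Bearer-token auth goes in the Authorization header.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

# A minimal MCP JSON-RPC request that lists the server's tools.
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Send it with any HTTP client, e.g.:
# requests.post("https://task-mcp.parallel.ai/mcp",
#               headers=headers, data=json.dumps(payload))
print(json.dumps(payload))
```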
Cursor
Add to ~/.cursor/mcp.json or .cursor/mcp.json (project-specific):
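A minimal configuration sketch (the server name parallel-task-mcp is illustrative; any name works):

```json
{
  "mcpServers": {
    "parallel-task-mcp": {
      "url": "https://task-mcp.parallel.ai/mcp"
    }
  }
}
```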
VS Code
Add to settings.json in VS Code:
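A configuration sketch consistent with VS Code’s format (mcp.servers with a type: "http" field, as noted in Troubleshooting below; the server name is illustrative):

```json
{
  "mcp": {
    "servers": {
      "parallel-task-mcp": {
        "type": "http",
        "url": "https://task-mcp.parallel.ai/mcp"
      }
    }
  }
}
```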
Claude Desktop / Claude.ai
Go to Settings → Connectors → Add Custom Connector, and fill in a name and the server URL (https://task-mcp.parallel.ai/mcp).
Claude Code
Run this command in your terminal (the server name is your choice):
claude mcp add --transport http parallel-task-mcp https://task-mcp.parallel.ai/mcp
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
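A configuration sketch using Windsurf’s serverUrl property (as noted in Troubleshooting below; the server name is illustrative):

```json
{
  "mcpServers": {
    "parallel-task-mcp": {
      "serverUrl": "https://task-mcp.parallel.ai/mcp"
    }
  }
}
```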
Cline
Go to MCP Servers → Remote Servers → Edit Configuration, and add the server URL https://task-mcp.parallel.ai/mcp.
Gemini CLI
Add to ~/.gemini/settings.json:
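A configuration sketch, assuming Gemini CLI’s httpUrl field for HTTP MCP servers (the server name is illustrative):

```json
{
  "mcpServers": {
    "parallel-task-mcp": {
      "httpUrl": "https://task-mcp.parallel.ai/mcp"
    }
  }
}
```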
ChatGPT
WARNING: Developer Mode must be enabled, and this feature may not be available to everyone. Additionally, MCPs in ChatGPT are experimental and may not work reliably. First, go to Settings → Connectors → Advanced Settings, and turn on Developer Mode. Then, in connector settings, click Create and fill in a name and the server URL (https://task-mcp.parallel.ai/mcp).
OpenAI Codex CLI
Add to ~/.codex/config.toml:
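A configuration sketch: Codex’s config registers servers under mcp_servers tables; one way to reach a remote HTTP MCP is via the mcp-remote proxy (the table name is illustrative):

```toml
[mcp_servers.parallel_task_mcp]
command = "npx"
args = ["-y", "mcp-remote", "https://task-mcp.parallel.ai/mcp"]
```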
Amp
Run this command in your terminal:
Kiro
Add to .kiro/settings/mcp.json (workspace) or ~/.kiro/settings/mcp.json (global):
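A configuration sketch, assuming Kiro’s mcpServers shape with a stdio command and the mcp-remote proxy for the remote HTTP server (the server name is illustrative):

```json
{
  "mcpServers": {
    "parallel-task-mcp": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://task-mcp.parallel.ai/mcp"],
      "disabled": false
    }
  }
}
```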
Google Antigravity
In the Antigravity Agent pane, click the menu (⋮) → MCP Servers → Manage MCP Servers → View raw config, then add the Parallel server entry (server URL: https://task-mcp.parallel.ai/mcp).
Best Practices
Choose enabled MCPs carefully
Be careful which tools and features you have enabled in your MCP client. When using Parallel in combination with many other tools, the increased context window may cause degraded output quality. Additionally, the LLM may prefer standard web search or deep research over Parallel if both are enabled. Turn off other web or deep-research tools, or explicitly mention that you want to use Parallel MCPs.
Limit data source context size
The Task MCP is a powerful tool for batch deep research, but it is constrained by the context window size and max output tokens of the LLM. Design your prompts and tool calls to avoid overflowing these limitations, or you may experience failures, degraded performance, or lower output quality. For large datasets, use the API or other no-code integrations. The Task MCP is designed for smaller parallel tasks and experimentation.
Follow up on tasks
The Task MCP only initiates Deep Research and Task Groups; it does not wait for tasks to complete. Fetch the status or results with a follow-up tool call after research is complete. This asynchronous design lets you initiate several deep research tasks and task groups without overflowing the context window. To run multiple tasks or batches in a workflow, reply after each step so the LLM can verify the task is complete and initiate the next one.
Use with larger models only
While our Web Search MCP works well with smaller models (such as GPT OSS 20B), the Task MCP requires strong reasoning capability. Use it with larger models only (such as GPT-5 or Claude Sonnet 4.5). Smaller models may result in degraded output quality.
Troubleshooting
Common Installation Issues
Cline: 'Authorization Error redirect_uri must be https'
Gemini CLI: Where to provide API key
Gemini CLI uses HTTP MCPs and authenticates via OAuth. If OAuth isn’t working, you can provide your API key directly.
Solution: Use environment variables or the mcp-remote proxy. Add this to ~/.gemini/settings.json and replace YOUR-PARALLEL-API-KEY with your key from platform.parallel.ai.
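One possible shape for the Gemini CLI entry, assuming mcp-remote’s --header flag for passing the Bearer token (the server name is illustrative):

```json
{
  "mcpServers": {
    "parallel-task-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://task-mcp.parallel.ai/mcp",
        "--header",
        "Authorization: Bearer YOUR-PARALLEL-API-KEY"
      ]
    }
  }
}
```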
VS Code: Incorrect configuration format
VS Code requires a specific configuration format. Common mistakes include using the wrong property names (such as copying the Cursor-style mcpServers shape). Note: VS Code uses mcp.servers (not mcpServers) and requires the type: "http" field.
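For illustration, the correct VS Code shape looks like this (the incorrect Cursor-style shape uses a top-level mcpServers object instead; the server name is illustrative):

```json
{
  "mcp": {
    "servers": {
      "parallel-task-mcp": {
        "type": "http",
        "url": "https://task-mcp.parallel.ai/mcp"
      }
    }
  }
}
```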
Windsurf: Configuration location and format
Windsurf uses a different configuration format than Cursor. Note: Windsurf uses serverUrl instead of url. Add this to your Windsurf MCP configuration file.
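A correct Windsurf configuration sketch using serverUrl (the server name is illustrative):

```json
{
  "mcpServers": {
    "parallel-task-mcp": {
      "serverUrl": "https://task-mcp.parallel.ai/mcp"
    }
  }
}
```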
Connection timeout or 'server unavailable' errors
Tools not appearing in the IDE
If the MCP installs but tools don’t show up:
- Restart your IDE completely (not just reload)
- Check configuration syntax: Ensure valid JSON with no trailing commas
- Verify the server URL: Must be exactly https://task-mcp.parallel.ai/mcp
- Check IDE logs: Look for MCP-related errors in your IDE’s output/debug panel