Overview

The Parallel MCP Server exposes the Parallel Search API to AI assistants and large-language-model (LLM) workflows, delivering high-quality, relevant results from the web while optimizing for the price-performance balance your AI applications need at scale.

Remote MCP Server

Parallel’s Search MCP server is available at https://mcp.parallel.ai/alpha/search_mcp. You must have a valid API key to access the remote MCP server. If you do not have an API key, please generate one here.

Using the MCP server

To use the Parallel Search MCP server, see your provider’s documentation for details on how to specify MCP servers in a request. The code below provides an example for OpenAI- and Anthropic-compatible LLM requests.
from openai import OpenAI
from openai.types import responses as openai_responses

api_key = "your-api-key"

# JSON schema for the structured response (example: a single summary field).
output_schema = {
    "type": "object",
    "properties": {"summary": {"type": "string"}},
    "required": ["summary"],
    "additionalProperties": False,
}

# Register the Parallel Search MCP server as a tool, authenticating
# with the Parallel API key via the x-api-key header.
tools = [
    openai_responses.tool_param.Mcp(
        server_label="parallel_web_search",
        server_url="https://mcp.parallel.ai/alpha/search_mcp",
        headers={"x-api-key": api_key},
        type="mcp",
        require_approval="never",
    )
]

response = OpenAI().responses.create(
    model="gpt-4.1",
    input="What is the latest in AI research?",
    tools=tools,
    tool_choice="required",
    text={
        "format": {
            "type": "json_schema",
            "name": "output",
            "schema": output_schema,
            "strict": True,
        },
    },
)

print(response.output_text)
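For Anthropic-compatible requests, remote MCP servers can be attached through the MCP connector in the Messages API, which takes an mcp_servers parameter and is gated behind the mcp-client-2025-04-04 beta flag. The sketch below is an assumption-laden example, not Parallel's official Anthropic integration: the connector sends authorization_token as a Bearer token, so confirm against Parallel's documentation whether that is accepted in place of the x-api-key header, and treat the model name as a placeholder.

```python
import os

# Remote MCP server configuration for Anthropic's MCP connector (beta).
# ASSUMPTION: "authorization_token" is sent as a Bearer token; verify that
# the Parallel server accepts it in place of the "x-api-key" header.
mcp_servers = [
    {
        "type": "url",
        "url": "https://mcp.parallel.ai/alpha/search_mcp",
        "name": "parallel_web_search",
        "authorization_token": os.environ.get("PARALLEL_API_KEY", "your-api-key"),
    }
]

# Only issue the request when an Anthropic key is configured.
if os.environ.get("ANTHROPIC_API_KEY"):
    import anthropic

    response = anthropic.Anthropic().beta.messages.create(
        model="claude-sonnet-4-20250514",  # placeholder model name
        max_tokens=1024,
        messages=[
            {"role": "user", "content": "What is the latest in AI research?"}
        ],
        mcp_servers=mcp_servers,
        betas=["mcp-client-2025-04-04"],
    )
    print(response.content)
```

The server entry mirrors the OpenAI example above: same URL, and the server_label becomes the connector's name field.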