The Parallel MCP Server exposes the Parallel Search API to AI assistants and
large-language-model (LLM) workflows, delivering high-quality, relevant results
from the web while optimizing for the price-performance balance your AI applications need at scale.
Parallel’s Search MCP server is available at https://mcp.parallel.ai/alpha/search_mcp. You must have a valid API key to access the remote MCP server. If you do not have an API key,
please generate one here.
To use the Parallel Search MCP server, see your provider’s documentation for details
on how to specify MCP servers in your request. The code below provides examples
for OpenAI- and Anthropic-compatible LLM requests.
from openai import OpenAI
from openai.types import responses as openai_responses

api_key = "your-api-key"

# Placeholder JSON schema for the structured output; replace with your own.
output_schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
    "additionalProperties": False,
}

# Register the Parallel Search MCP server as a tool, authenticating with
# your Parallel API key via the x-api-key header.
tools = [
    openai_responses.tool_param.Mcp(
        server_label="parallel_web_search",
        server_url="https://mcp.parallel.ai/alpha/search_mcp",
        headers={"x-api-key": api_key},
        type="mcp",
        require_approval="never",
    )
]

response = OpenAI().responses.create(
    model="gpt-4.1",
    input="What is the latest in AI research?",
    tools=tools,
    tool_choice="required",
    text={
        "format": {
            "type": "json_schema",
            "name": "output",
            "schema": output_schema,
            "strict": True,
        },
    },
)
print(response)
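For Anthropic-compatible requests, the sketch below assumes the Anthropic Python SDK’s MCP connector beta (the mcp-client-2025-04-04 beta flag); the model name and the use of authorization_token to carry the Parallel API key are illustrative assumptions, so check Parallel’s and Anthropic’s documentation for the exact authentication mechanism your account requires.

import anthropic

api_key = "your-api-key"

client = anthropic.Anthropic()

response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",  # illustrative model name
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "What is the latest in AI research?"}
    ],
    # Point the MCP connector at the Parallel Search MCP server.
    mcp_servers=[
        {
            "type": "url",
            "url": "https://mcp.parallel.ai/alpha/search_mcp",
            "name": "parallel_web_search",
            # Assumption: the Parallel API key is passed as the server's
            # authorization token.
            "authorization_token": api_key,
        }
    ],
    betas=["mcp-client-2025-04-04"],
)
print(response)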