MCP Server

Key Concepts

Introduction

The QuantConnect Model Context Protocol (MCP) server is a bridge that connects a Large Language Model (LLM) to the QuantConnect API. With this setup, you can prompt the LLM and have it interact with our API on your behalf. The server equips the LLM with tools to create projects, run backtests, deploy live algorithms, and more. Our MCP integration can supercharge your existing QuantConnect workflow and even opens quant trading to non-programmers. Simply prompt the LLM in plain language and let it do the heavy lifting for you.
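
The server itself behaves like any other MCP server: a client connects to it, discovers its tools, and calls them on the LLM's behalf. As a rough sketch of what happens under the hood, the following Python snippet uses the MCP Python SDK to connect over stdio and list the available tools. The launch command and credential variable names are placeholders, not the real values; use the command from the Getting Started guide for your client.

```python
# Minimal sketch of an MCP client discovering the server's tools.
# The launch command and environment variable names below are placeholders;
# use the values from the Getting Started guide for your client.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    server = StdioServerParameters(
        command="<server-launch-command>",  # placeholder launch command
        args=[],
        env={"QC_USER_ID": "...", "QC_API_TOKEN": "..."},  # assumed credential names
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```

In practice, your client application (Ask Mia, Copilot Chat, Cursor, or Windsurf) performs this handshake for you; you only write prompts.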

Example Conversation

The following conversation with Copilot Chat in Local Platform demonstrates how the server equips an LLM with the ability to create trading strategies, backtest them, and deploy them to live trading:

Getting Started

To start using the QC MCP Server, see the Getting Started guide for one of the supported MCP clients.

Tools

The server provides the following tools:

Projects

  • read_open_project - Read the project that's currently open.
  • update_project - Update a project's name or description.

Project Collaboration

  • create_project_collaborator - Add a collaborator to a project.
  • read_project_collaborators - List all collaborators on a project.
  • update_project_collaborator - Update collaborator information in a project.
  • delete_project_collaborator - Remove a collaborator from a project.

Project Nodes

  • read_project_nodes - Read the available and selected nodes of a project.

Compile

  • create_compile - Compile the project to check for syntax errors, compilation errors, and other issues.

Backtests

  • create_backtest - Create a new backtest and get the backtest ID (see the sketch after this list).
  • read_backtest - Read the results of a backtest.
  • list_backtests - List all the backtests for the project.
  • update_backtest - Update the name or note of a backtest.
  • delete_backtest - Delete a backtest from a project.
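
The compile and backtest tools are typically chained, because a backtest runs against a specific compile of the project. The sketch below shows roughly how a client could drive that flow through an MCP session created as in the earlier snippet. The argument names (projectId, compileId, name, backtestId) are assumptions modelled on the QuantConnect API; in normal use the LLM chooses the tools and arguments for you, and you can confirm the real schemas with session.list_tools().

```python
# Sketch only: assumes `session` is an initialized mcp.ClientSession connected
# to the QuantConnect MCP server, and that the tool arguments mirror the
# QuantConnect API (projectId, compileId, name, backtestId).
async def compile_and_backtest(session, project_id: int):
    # 1. Compile the project; the compile ID comes back in this response
    #    (how it is encoded depends on the server's response format).
    compile_response = await session.call_tool(
        "create_compile", {"projectId": project_id}
    )

    # 2. Create a backtest for that compile.
    await session.call_tool(
        "create_backtest",
        {
            "projectId": project_id,
            "compileId": "<compile-id-from-step-1>",  # placeholder
            "name": "MCP sketch backtest",
        },
    )

    # 3. Read the backtest results once it finishes.
    return await session.call_tool(
        "read_backtest",
        {"projectId": project_id, "backtestId": "<backtest-id-from-step-2>"},  # placeholder
    )
```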

Optimization

  • create_optimization - Create an optimization with the specified parameters.
  • read_optimization - Read an optimization.
  • list_optimizations - List all the optimizations for a project.
  • abort_optimization - Abort an optimization.
  • delete_optimization - Delete an optimization.

Live Trading

  • create_live_algorithm - Create a live algorithm.
  • read_live_algorithm - Read details of a live algorithm.
  • stop_live_algorithm - Stop a live algorithm.
  • liquidate_live_algorithm - Liquidate and stop a live algorithm.
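
The live-trading tools above follow the same pattern. Below is a hedged sketch for reading a live deployment and stopping or liquidating it; the argument names are assumptions modelled on the QuantConnect API, and in practice the LLM issues these calls in response to prompts like the ones in the next section.

```python
# Sketch only: assumes `session` is an initialized mcp.ClientSession and that
# the live-trading tools identify the deployment by projectId. Argument names
# are assumptions; check session.list_tools() for the real schemas.
async def wind_down(session, project_id: int, liquidate: bool = False):
    # Inspect the live deployment before touching it.
    status = await session.call_tool(
        "read_live_algorithm", {"projectId": project_id}
    )

    # Either liquidate all holdings and stop, or just stop the algorithm.
    tool_name = "liquidate_live_algorithm" if liquidate else "stop_live_algorithm"
    await session.call_tool(tool_name, {"projectId": project_id})
    return status
```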

Prompt Ideas

The following prompts are some ideas for how you can leverage the LLM's knowledge to improve your workflow:

> Are there any new strategies out there being discussed online?

> How do you think we can improve the Sharpe ratio of this strategy?

> Review my live algorithms. Are there any you think we should stop trading?

> Add Option hedging to my strategy.

Workflow

The MCP server is available in QuantConnect Cloud and in some local IDEs.

Cloud Workflow

A cloud workflow might look like this:

  1. Open a project in Cloud Platform.
  2. In the IDE, open the Ask Mia panel.
  3. Issue prompts to investigate new trading ideas, create projects, run backtests, analyze backtest performance, and manage live algorithms.
  4. When you're finished, close the project.

Local Workflow

A local workflow might look like this:

  1. Open Local Platform, Cursor, or Windsurf.
  2. Open a project and connect to the MCP Server.
  3. For specific instructions on each IDE, open the MCP Server documentation, click a client application, and scroll down to the Getting Started section.
  4. In the chat window, issue prompts to investigate new trading ideas, create projects, run backtests, analyze backtest performance, and manage live algorithms.
  5. When you're finished, close the project.

Quotas

There are no quotas on the QuantConnect API, but the client you use may impose its own. For more information, see the Quotas section for one of the supported MCP clients.

Troubleshooting

The following sections explain some issues you may encounter and how to resolve them.

Service Outages

The MCP server relies on the QuantConnect API and the client application. To check the status of the QuantConnect API, see our Status page. To check the status of your client and the LLM, see their respective status pages.

Examples

The following examples demonstrate the MCP server.

Example 1: Hello World

To test that the server and client are connected and working, enter the following prompt into the client application:

> Read the open project.

The agent should call the read_open_project tool.
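
Behind the scenes, that tool call is roughly equivalent to the following snippet. It assumes an initialized MCP ClientSession as in the earlier sketches; whether the tool takes any arguments is an assumption, so check session.list_tools() for the real schema.

```python
async def check_connection(session):
    # Rough equivalent of the "Read the open project" prompt. `session` is an
    # initialized mcp.ClientSession; the empty argument dict is an assumption.
    result = await session.call_tool("read_open_project", {})
    print(result.content)
```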

Example 2: Strategy Development and Deployment

This example uses Copilot Chat in Local Platform to brainstorm new strategy ideas, edit files, run backtests, and deploy to paper trading.

You can also see our Videos or get in touch with us via Discord.
