Parameters are project variables that your algorithm uses to define the value of internal variables like indicator arguments or the length of lookback windows.

Setting Values

Parameters are stored outside of your algorithm code, but we inject their values into your algorithm when you launch an optimization job. The optimizer adjusts the value of your project parameters across a range and step size that you define to minimize or maximize an objective function. To optimize parameters, add them to your project and call the GetParameter method in your code files.

The GetParameter method returns the parameter value with the numeric type of the default value you provide. If you don't provide a default value, the method returns a string, which you need to cast to the data type you need. If no parameter in your project matches the name you pass and you provide a default value, the method returns the default value.

// Get the parameter value and return an integer
var intParameterValue = GetParameter("<parameter-name>", 100);

// Get the parameter value as a string and cast it to an integer
var castedParameterValue = Convert.ToInt32(GetParameter("<parameter-name>"));

# Get the parameter value and return an integer
parameter_value = self.GetParameter("<parameter-name>", 100)

# Get the parameter value as a string and cast it to an integer
parameter_value = int(self.GetParameter("<parameter-name>"))


Overfitting occurs when a function is fit too closely to a limited set of training data. Overfitting can occur in your trading algorithms if you have many parameters or select parameter values that worked very well in the past but are sensitive to small changes in their values. In these cases, your algorithm will likely be fine-tuned to fit the detail and noise of the historical data to the extent that it negatively impacts the live performance of your algorithm. The following image shows examples of underfit, optimally-fit, and overfit functions:

Overfitting an optimization job

An algorithm that is dynamic and generalizes to new data is more likely to survive across different market conditions and apply to other markets.
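One way to probe for this kind of fragility is to compare the objective value at the chosen parameter against its immediate neighbors in the grid. The following is a minimal sketch; is_robust and the example Sharpe ratios are hypothetical, not part of the QuantConnect API:

```python
# Hypothetical robustness check: a parameter value whose objective collapses
# at neighboring values is likely fine-tuned to noise in the historical data.
def is_robust(results, best_value, step, tolerance=0.2):
    """results maps a parameter value to its objective (e.g. Sharpe ratio)."""
    best = results[best_value]
    neighbors = [results.get(best_value - step), results.get(best_value + step)]
    neighbors = [n for n in neighbors if n is not None]
    # Robust if no neighbor falls more than `tolerance` below the best value.
    return all(n >= best * (1 - tolerance) for n in neighbors)

# A sharp peak at 20 that collapses at 19 and 21 is a warning sign.
results = {19: 0.4, 20: 1.8, 21: 0.3}
print(is_robust(results, best_value=20, step=1))  # False: performance is fragile
```

A parameter that passes this kind of neighborhood check is more likely to generalize than one that only works at a single grid point.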

Look-Ahead Bias

Look-ahead bias occurs when an algorithm makes decisions using data that would not yet have been available. For instance, in optimization jobs, you optimize a set of parameters over a historical backtesting period. After the optimizer finds the optimal parameter values, the backtest period becomes part of the in-sample data. If you run a backtest over the same period using the optimal parameters, look-ahead bias has seeped into your research. In reality, it would not be possible to know the optimal parameters during the testing period until after the testing period is over. To avoid issues with look-ahead bias, optimize on older historical data and test the optimal parameter values on recent historical data. Alternatively, apply walk-forward optimization to optimize the parameters on smaller batches of history.
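The windowing behind walk-forward optimization can be sketched in plain Python. walk_forward_windows below is an illustrative helper, not a QuantConnect API; it pairs each in-sample optimization window with the out-of-sample window that immediately follows it:

```python
from datetime import date, timedelta

def walk_forward_windows(start, end, in_sample_days, out_sample_days):
    """Yield (in_sample_start, in_sample_end, out_sample_end) triples.
    Optimize parameters on [in_sample_start, in_sample_end), then test them
    on [in_sample_end, out_sample_end) -- data the optimizer never saw."""
    windows = []
    cursor = start
    while cursor + timedelta(days=in_sample_days + out_sample_days) <= end:
        in_sample_end = cursor + timedelta(days=in_sample_days)
        out_sample_end = in_sample_end + timedelta(days=out_sample_days)
        windows.append((cursor, in_sample_end, out_sample_end))
        cursor += timedelta(days=out_sample_days)  # roll forward one step
    return windows

for window in walk_forward_windows(date(2020, 1, 1), date(2021, 1, 1), 180, 60):
    print(window)
```

Because each out-of-sample window starts after its in-sample window ends, the parameters are always tested on data that was unavailable when they were chosen.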

Number of Parameters

The cloud optimizer can optimize up to 2 parameters. There are several reasons for this quota. First, the optimizer only supports the grid search strategy, which is very inefficient. This strategy tests every permutation of parameter values, so the number of backtests that the optimization job must run explodes as you add more parameters. Second, the parameter charts that display the optimization results are limited to two dimensions. Third, if you optimize with many variables, it increases the likelihood of overfitting to historical data.
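The combinatorial growth of grid search is easy to see with itertools.product. The parameter names and value ranges below are hypothetical:

```python
from itertools import product

# Every added parameter multiplies the number of backtests
# by the size of its value range.
ema_fast = range(5, 55, 5)      # 10 values
ema_slow = range(50, 250, 25)   # 8 values
grid_2d = list(product(ema_fast, ema_slow))
print(len(grid_2d))   # 80 backtests for two parameters

# A third 10-value parameter multiplies the job size again.
stop_loss = range(1, 11)        # 10 values
grid_3d = list(product(ema_fast, ema_slow, stop_loss))
print(len(grid_3d))   # 800 backtests for three parameters
```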

We plan to upgrade the parameter charts on the optimization results page to support 3D surface charts. When we upgrade the visualization technology and add more efficient optimization strategies, we will increase the number of parameters you can optimize in the cloud. To optimize more parameters now, run local optimizations with the CLI.

You can also watch our Videos or get in touch with us via Discord.
